Manon den Dunnen

Activity: 39 updates · 16 thumbs up · 2 comments
Manon den Dunnen, strategic digital specialist, posted

Running Large Language Models locally 3/3


In this series of three workshops we want to help you set up Ollama and run your own local LLMs. Running a Large Language Model locally has many advantages: besides not paying for a pro plan or API costs, it also means not sharing your chat data.

Thanks to recent developments ('quantization'), we now have models like Mixtral 8x7B that run on your laptop! There are also many tools that help you run, create, and share LLMs locally from the command line, such as the open-source app Ollama.

Workshop 1/3 (April 17th): getting started
Workshop 2/3 (May 15th): making the most of Ollama on a variety of devices
Workshop 3/3 (June 19th): fine-tuning your LLM
https://www.meetup.com/sensemakersams/events/299714048/
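For anyone who wants a preview of what "running locally" looks like in practice: once Ollama is installed, it serves a small HTTP API on localhost (port 11434 by default). The sketch below is a minimal, unofficial example of calling that API from Python; the model name `mistral` is just a placeholder for whatever model you have pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint for one-shot text generation
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt, model="mistral"):
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {
        "model": model,    # any model you have pulled, e.g. via `ollama pull mistral`
        "prompt": prompt,
        "stream": False,   # ask for one JSON response instead of a token stream
    }

def generate(prompt, model="mistral", url=OLLAMA_URL, timeout=60):
    """Send a prompt to the local Ollama server and return the response text."""
    data = json.dumps(build_payload(prompt, model)).encode("utf-8")
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]

# Usage, with `ollama serve` running and a model pulled:
#   print(generate("Why run an LLM locally? Answer in one sentence."))
```

Because everything stays on localhost, the prompt and the answer never leave your machine — which is exactly the privacy point made above.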

Meet-up on Jun 19th
Manon den Dunnen, strategic digital specialist, posted

Running Large Language Models locally 2/3


Workshop 1/3 (April 17th): getting started
Workshop 2/3 (May 15th): making the most of Ollama on a variety of devices
Workshop 3/3 (June 19th): fine-tuning your LLM
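A rough sketch of why quantization matters for running models on ordinary hardware: the memory needed for the weights scales directly with bits per weight, so dropping from 16-bit to 4-bit weights cuts the footprint to a quarter. The parameter count below is an approximation for an 8x7B mixture-of-experts model, used only for illustration.

```python
def model_size_gb(n_params, bits_per_weight):
    """Approximate memory needed just for the weights, in gigabytes."""
    return n_params * bits_per_weight / 8 / 1e9

# An 8x7B mixture-of-experts model is reported to have roughly 47 billion
# parameters in total (the experts share some layers, so it is less than 8 * 7B).
N_PARAMS = 47e9

fp16_gb = model_size_gb(N_PARAMS, 16)  # full 16-bit weights
q4_gb = model_size_gb(N_PARAMS, 4)     # 4-bit quantized weights

print(f"fp16: ~{fp16_gb:.0f} GB, 4-bit: ~{q4_gb:.0f} GB")
```

This is only the weight storage — inference also needs working memory — but it shows why a quantized model fits on hardware where the full-precision one never could.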

Meet-up on May 15th
Manon den Dunnen, strategic digital specialist, posted

Ollama & running Large Language Models locally


Workshop 1/3 (April 17th, during Appril Festival): getting started
Workshop 2/3 (May 15th): making the most of Ollama on a variety of devices
Workshop 3/3 (June 19th): fine-tuning your LLM
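If you want to come prepared for the getting-started session, you can check beforehand whether your Ollama install is up and reachable. A small sketch assuming the default local endpoint; `/api/tags` is the Ollama API route that lists your locally pulled models.

```python
import json
import urllib.error
import urllib.request

def list_local_models(host="http://localhost:11434", timeout=5):
    """Return the names of locally pulled models, or None if no server answers."""
    try:
        with urllib.request.urlopen(f"{host}/api/tags", timeout=timeout) as resp:
            data = json.loads(resp.read())
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None  # Ollama not installed, or `ollama serve` is not running

if __name__ == "__main__":
    models = list_local_models()
    if models is None:
        print("No Ollama server found - install it and run `ollama serve` first.")
    else:
        print("Locally available models:", models or "(none pulled yet)")
```

If the list comes back empty, `ollama pull <model>` on the command line fetches a model so you can start experimenting right away.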

Meet-up on Apr 17th
Manon den Dunnen, strategic digital specialist, posted

Free Arduino workshop - part 1

Featured image

From January to April, every third Wednesday there will be a course (workshop) to learn about Arduino, electronics, and creating your own products and solutions with them! The workshop starts at 19:00 sharp; be sure to have the Arduino IDE downloaded on your computer.

But you can also work on your own projects with like-minded people and discuss your challenges, learn to work with a Raspberry Pi, or work on a prototype to 3D-print, etc. Manuals are available (and also on YouTube), and we have a lot of fun little projects to work on!

Meet-up on Jan 18th
Manon den Dunnen, strategic digital specialist, posted

X-mas makers' night special


This evening we will have some X-mas ideas and materials for you to work on, so you not only (learn to) use the machines & sensors but also take home a souvenir. (It would help to have Inkscape already installed if you want to make the X-mas tree decorations with your name in them. :-)

Of course you can also work on your own projects with like-minded people and discuss your challenges, learn to work with a Raspberry Pi, or work on a prototype to 3D-print, etc. Manuals are available (and also on YouTube), and we have a lot of fun little projects to work on!

Meet-up on Dec 21st