Jason Adika Tanuwijaya
Problem Statement :
Install the Ollama software, pull the Q4_K_M quantization of llama3.2:3b-instruct and ask it a question that could be answered by reading Writing R Extensions.
Solution :
- Download Ollama onto your PC using this link.
- Run the .exe file
- Open your terminal and check that Ollama is installed on your PC:
ollama --version
- Pull the model you want to use from those available in Ollama; for this test we will use llama3.2:3b-instruct-Q4_K_M.
ollama pull llama3.2:3b-instruct-Q4_K_M
- Run the model and ask it a question.
ollama run llama3.2:3b-instruct-Q4_K_M "<Your Prompt>"
The test output can be seen in Easy_Test.txt
Problem Statement:
Use the ellmer package to perform the easy task above but programmatically.
Solution :
- For this test we need the ellmer package in R.
- The function we will use from this package is
chat_ollama()
chat_ollama(
  system_prompt = NULL,
  turns = NULL,
  base_url = "http://localhost:11434",
  model,
  seed = NULL,
  api_args = list(),
  echo = NULL
)
Full Solution in Medium_Test.R
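A minimal sketch of the programmatic call (the system prompt and question are illustrative assumptions, not the contents of Medium_Test.R; it assumes Ollama is running locally with the model already pulled):

chat <- ellmer::chat_ollama(
  model = "llama3.2:3b-instruct-Q4_K_M",
  system_prompt = "You are an expert on the Writing R Extensions manual."
)

# chat$chat() sends a prompt and returns the model's reply as a string.
answer <- chat$chat("Which fields are mandatory in a package DESCRIPTION file?")
cat(answer)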
Problem Statement :
Create an R package that depends on ellmer and provides a function that takes the name of a package and returns a character vector of the functions exported by that package, using only ellmer and llama3.2:3b from Ollama. The list of functions does not need to be correct (making it correct is the whole point of this project).
Solution :
- Use the function from Medium_Test.R, improve it with prompt engineering to answer the problem, and put it in an R package.
Full Solution in ChatRhardtest.R
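A minimal sketch of such a function (the helper name exported_functions and the prompt wording are assumptions for illustration, not the actual code in ChatRhardtest.R):

exported_functions <- function(pkg) {
  chat <- ellmer::chat_ollama(
    model = "llama3.2:3b",
    system_prompt = paste(
      "You are an expert R programmer.",
      "Reply with one function name per line and nothing else."
    )
  )
  reply <- chat$chat(paste("List the functions exported by the R package", pkg))
  # Split the reply into a character vector, dropping empty lines.
  # The names are whatever the model claims; they are not verified.
  fns <- trimws(strsplit(reply, "\n")[[1]])
  fns[nzchar(fns)]
}

# Example: exported_functions("stats")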