Overview
AskAI.jl, as its name suggests, is a straightforward tool for querying Large Language Models. Currently supporting only Google's Gemini model due to its free, though rate-limited, API, it's designed to be simple and direct: send prompts and questions to Gemini, and optionally execute the included code within a sandboxed "playground" to avoid affecting the main scope.
The main macro, `@ai`, retrieves results from a large language model (currently Gemini; more models will be supported soon), while `@AI` executes the returned code within the "playground" scope and displays the output (or any errors).
A REPL mode is also supported: press `}` to enter it and backspace to exit.
As with most AI tools, it needs an API key. You can apply for a Gemini key from Google Gemini and then set it in `ENV["AI_API_KEY"]`, or use `AskAI.setapi()` to replace it with a new key. Please also add a module called playground (`module playground end`) to your main scope for code execution.
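The playground module keeps generated code out of your main namespace. As a minimal sketch of the idea (an assumption about the mechanism, not AskAI's actual implementation), code can be evaluated inside that module like this:

```julia
# Define the sandbox module, then evaluate generated code inside it.
module playground end

code = "x = 1 + 1"
result = Core.eval(playground, Meta.parse(code))
# `x` is now defined in `playground`, not in Main
```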
A convenient way is to put the code below in your Julia startup.jl configuration file:

```julia
ENV["AI_API_KEY"] = "your_key_for_Google_Gemini"
module playground end
using AskAI
```
Then AskAI is available in every session by default.

AskAI started as my personal AI tool in the Julia REPL and currently supports only the Gemini model. Have fun with it, and I welcome your suggestions and input for AskAI.jl!
Quick example
Here's an example of using AskAI to generate scatter and histogram plots and perform basic statistical calculations.
```julia
@AI "tell me the current date, use a package when needed"
@AI "create a new project in /tmp, name it as demo + date, activate it"
@AI "load my data as df, the data file is in /tmp/celldata.csv"
@AI "tell me the data size"
@AI "does the data contain columns named geneX and geneY?"
@AI "install the package to support figure display in the terminal"
@AI "plot a scatter plot of geneX and geneY, I want the geneX on axis Y"
@AI "please also label the axes"
@AI "calculate the correlation of geneX and geneY"
@AI "keep only 3 digits"
@AI "generate a histogram to show the distribution of geneX"
@AI "do the same to geneY"
@AI "fit a linear model to predict the value of geneY from geneX, using GLM"
@AI "give me the coef of geneX in this model, keep 5 digits"
```
The output results are shown here.
Details
Sometimes you may get a wrong result from the LLM; its answers aren't always perfect, so please double-check. You can use `@ai` instead of `@AI` to review the code first, then use `exe()` to execute it. The most common issue I have met is that a necessary package isn't installed.
```julia
@ai "tell me the current date, install the package if needed"
AskAI.exe(ans)
```
To review the conversation history:
```julia
AskAI.Brain.history["ask"]
AskAI.Brain.history["ans"]

# review the last response
AskAI.Brain.history["ans"][end] |> AskAI.MD
```
You can also try stream mode in the terminal:
```julia
AskAI.Brain.stream = true
@ai "why the sky is blue"
```
Functions and macros
AskAI.exe — Method

Execute a string as code.

```julia
"1 + 1" |> AskAI.exe

(@ai "1 + 1") |> AskAI.exe
```
AskAI.reset — Method

Reset AskAI: this removes the conversation history, memory, prompt, and everything else, restoring the defaults.

```julia
AskAI.reset()
```
AskAI.setapi — Method

Set the API key.

```julia
setapi("1234567890abcdef1234567890abcdef")
```
AskAI.@AI — Macro

Similar to `@ai`: it sends the question to the AI, but `@AI` executes the returned code directly and returns only the result (or an error). The conversation history is stored in `AskAI.Brain.history`.

Example:

```julia
@AI "tell me the current time, use the packages you need"
```
AskAI.@ai — Macro

Get the answer from the AI.

Example:

```julia
@ai "fit a linear model"

# or you can concatenate your question
@ai "fit a" + "linear model"
```
AskAI.checkMemory! — Function

Optimize the memory text: when the memory length exceeds 3000 words, summarize it into 300 words.
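As an illustration of the threshold logic described above (a hedged sketch, not AskAI's actual code; plain truncation stands in here for the LLM summary):

```julia
# Sketch: if the memory exceeds 3000 words, compress it to ~300 words.
# In AskAI the compression is done by the LLM; truncation is a stand-in.
function check_memory_sketch(memory::String)
    words = split(memory)
    length(words) > 3000 ? join(words[1:300], " ") : memory
end
```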
AskAI.showStreamStringFromChannel — Method

For stream mode: displays a streaming response from a channel, updating the terminal display with each chunk of text received.
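The channel-based streaming described above can be sketched with plain Julia `Channel`s (illustrative only; not AskAI's implementation):

```julia
# A producer task puts text chunks on a channel; the consumer prints
# each chunk as it arrives, so the terminal updates incrementally.
ch = Channel{String}(10)
@async begin
    for chunk in ["The sky ", "is blue ", "because of Rayleigh scattering."]
        put!(ch, chunk)
    end
    close(ch)
end
for chunk in ch
    print(chunk)
end
```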
AskAI.streamToMemory — Method

For stream mode: takes strings from the channel, converts them to markdown, and saves them as memory context.