AI Assistant
Welcome! In this guide, you'll discover an AI chatbot that can interact with the NEAR ecosystem.
This AI agent can:
- Explore and explain what happened in a transaction when given a transaction hash
- Request tokens from the testnet faucet
- Mint a special NFT and send it to a user through a wallet it controls
- Answer general questions about the NEAR architecture (powered by real-time search results)
Created by our community member Reza, this project was one of the AI track winners at the ETHGlobal Brussels 2024 hackathon.
Prerequisites
Before starting, make sure you have the following tools installed:
Then we need to run our AI model locally. Here we'll be using llama.cpp with Nous Hermes 2 Pro as the model.
Below are the steps to set it up at the time of writing, but please refer to the llama.cpp repository for up-to-date instructions:
- Install llama.cpp with the method of your choice; we'll use Homebrew here:
brew install llama.cpp
- Clone the model's repository by following the instructions on Hugging Face:
# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install
git clone git@hf.co:NousResearch/Hermes-2-Pro-Llama-3-8B
- Back to llama.cpp: if you haven't already, clone the repository and generate the GGUF file needed to run the model with llama.cpp:
git clone git@github.com:ggerganov/llama.cpp.git
cd llama.cpp
# Set up the environment and run the conversion script
python -m venv venv
source venv/bin/activate
python -m pip install -r requirements.txt
python convert_hf_to_gguf.py <path_to>/Hermes-2-Pro-Llama-3-8B/
- You should end up with a `hermes-2-pro-llama-3-8B-DPO-F16.gguf` file inside the `Hermes-2-Pro-Llama-3-8B` repository. Finally, let's run the llama.cpp server with it:
llama-server -m <path_to>/hermes-2-pro-llama-3-8B-DPO-F16.gguf
Open your browser at http://localhost:8080. If you see an interface similar to this one, you are ready to go 🚀
Make sure the `model.api_url` in `ai/config/general` is set to `http://localhost:8080/completion` to use your model running locally 😉
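Under the hood, the agent talks to this endpoint over HTTP POST. As a rough sketch of what a request body looks like (the `prompt`, `n_predict`, and `temperature` fields are part of the llama.cpp server's `/completion` API; the surrounding code is illustrative, not the project's own):

```python
import json

# Illustrative request body for llama.cpp's /completion endpoint.
# The URL must match the model.api_url config value.
API_URL = "http://localhost:8080/completion"

payload = {
    "prompt": "What is NEAR?",  # text the model should complete
    "n_predict": 64,            # maximum number of tokens to generate
    "temperature": 0.7,         # sampling temperature
}

body = json.dumps(payload)
print(body)
```

POSTing `body` to `API_URL` (e.g. with `requests.post`) returns a JSON response whose generated text the agent then parses.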
You can use a different model with llama.cpp if you wish! Just make sure to:
- Pick a model that supports function calling
- Update the `model.max_prompt_tokens` config according to the context length of the new model
- Update the ChatML config variables to match those of the new model
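ChatML wraps each message in special `<|im_start|>`/`<|im_end|>` tokens, which is what those config variables control. A minimal sketch of the templating (the function below is hypothetical, not the project's actual code; the delimiters are the standard ChatML ones used by Hermes-2-Pro):

```python
def build_chatml_prompt(messages):
    """Format a list of {role, content} dicts as a ChatML prompt string."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    # Leave the assistant turn open so the model completes it
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful NEAR assistant."},
    {"role": "user", "content": "What is NEAR?"},
])
print(prompt)
```

A model trained with different delimiters would need this template (and the matching config variables) swapped out.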
Setup and architecture
Start by cloning the repository of the project:
git clone git@github.com:RezaRahemtola/near-ai-assistant.git
You'll find two folders in it: `ai` and `front`.
AI
Let's start by configuring a virtual environment to install the dependencies:
cd ai/
python -m venv venv
source venv/bin/activate
python -m pip install poetry
poetry install
Then you can create a `.env` file and fill it with values inspired from the `.env.example` file:
- `OXYLABS_USERNAME` and `OXYLABS_PASSWORD` are API credentials used to access a SERP API to search for information on Google
- `NEAR_ACCOUNT_ID` and `NEAR_ACCOUNT_PRIVATE_KEY` are used by the AI to control a wallet and send transactions from it
- `NEAR_RPC_URL` can also be set in case you want to use a different RPC
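For reference, a filled-in `.env` might look like this (every value below is a placeholder; use your own credentials and account):

```
OXYLABS_USERNAME=my-oxylabs-user
OXYLABS_PASSWORD=my-oxylabs-password
NEAR_ACCOUNT_ID=my-agent.testnet
NEAR_ACCOUNT_PRIVATE_KEY=ed25519:...
NEAR_RPC_URL=https://rpc.testnet.near.org
```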
Once you've done all this, you can activate your virtual environment and launch the code 🚀
source venv/bin/activate
python src/main.py
Frontend
Now that your AI agent is ready to go, let's quickly launch a basic frontend to interact with it:
Install the dependencies:
cd front/
yarn
And launch it:
yarn dev
Usage
You can now head to http://localhost:5173, where you'll find an interface like this one to interact with the AI:
Here are a few example questions you can ask it:
- What is NEAR?
- What are the different transaction actions on NEAR?
- Can I please have an ETHGlobal Brussels NFT sent to me at random.testnet? Thanks
- I want to start using NEAR, can you send me some tokens on my testnet address random.testnet?
- I don't understand what this transaction is doing, can you help me? The transaction hash is `hash` and it was sent by someone.testnet.
Moving Forward
That's it for the quickstart tutorial. You have now seen an open-source AI agent interacting with NEAR and controlling a wallet to make transactions.
To better understand how it works, check the `agent.py` file and the Function Calling explanation on Hugging Face.
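In short, with function calling the model emits a structured "tool call" (typically JSON), and the Python side dispatches it to a real function. A minimal, hypothetical sketch of that loop (the tool name and arguments below are invented for illustration, not the project's actual tools):

```python
import json

def send_testnet_tokens(receiver: str, amount: int) -> str:
    """Stand-in for a tool; the real agent would sign a NEAR transaction."""
    return f"Sent {amount} NEAR to {receiver}"

# Registry mapping tool names the model may call to Python functions
TOOLS = {"send_testnet_tokens": send_testnet_tokens}

def dispatch(tool_call_json: str) -> str:
    """Parse a model-emitted tool call and run the matching function."""
    call = json.loads(tool_call_json)
    return TOOLS[call["name"]](**call["arguments"])

# Example tool call the model might emit for a faucet request
result = dispatch(
    '{"name": "send_testnet_tokens",'
    ' "arguments": {"receiver": "random.testnet", "amount": 10}}'
)
print(result)  # → Sent 10 NEAR to random.testnet
```

The function's return value is then fed back into the prompt so the model can report the outcome to the user.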
A lot of things could be built by leveraging this PoC; some ideas could be:
- A chatbot integrated into the explorer to summarize transactions directly in the page
- Interactive tutorials in the documentation through a chatbot
  - "How to create an NFT?"
  - Multiple chats with explanations and code to complete, given by the AI
  - In the end, the AI publishes the smart contract, mints an NFT, and sends it to you on testnet
- An AI trader reacting to on-chain or off-chain events to buy/sell some tokens with its wallet
- The only limit is your imagination!
Happy coding! 🚀