Reimagining Tooling as Coding (RTaC) Framework
Objective: Build a low-latency tool-use LLM that matches closed-source LLMs in performance while remaining cost-effective
- Created the RTaC framework to represent tools as Python functions, leveraging the docstring-reading capability of code-pretrained LLMs (see the tool-representation sketch below)
- Used PEFT to finetune Code Llama 7B and DeepSeek 1.3B on 2,500 examples of manually cleaned, GPT-generated synthetic data (see the LoRA sketch below)
- Achieved performance competitive with GPT-4 at 20% of its cost, while supporting dynamic tool addition and mathematical/iterative logic
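
The core idea of representing a tool as a Python function can be illustrated with a minimal sketch. The function and helper names below (`get_weather`, `build_tool_prompt`) are hypothetical, not taken from the project; they only show how a docstring-carrying signature is exposed to a code-pretrained LLM so it can emit a Python call.

```python
import inspect

def get_weather(city: str, unit: str = "celsius") -> dict:
    """Return the current weather for `city`.

    Args:
        city: Name of the city to query.
        unit: Temperature unit, "celsius" or "fahrenheit".
    """
    ...  # the actual API call is handled by the tool backend

def build_tool_prompt(tools, user_query: str) -> str:
    """Concatenate tool source (signatures + docstrings) with the user query
    so a code-pretrained LLM can answer with a call such as
    get_weather(city="Paris")."""
    specs = "\n\n".join(inspect.getsource(t) for t in tools)
    return f"# Available tools\n{specs}\n\n# Task\n# {user_query}\n"
```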
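
For the finetuning step, a hedged sketch of PEFT with LoRA via the Hugging Face `peft` library is shown below; the checkpoint ID, target modules, and hyperparameters are illustrative assumptions, not the project's exact configuration.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "codellama/CodeLlama-7b-hf"  # assumed base checkpoint
model = AutoModelForCausalLM.from_pretrained(base_id)
tokenizer = AutoTokenizer.from_pretrained(base_id)

lora_cfg = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections (assumed)
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only a small fraction of weights are trained
# ... train with transformers.Trainer on the ~2,500-example synthetic dataset ...
```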