Meet Laragenie: The AI "Colleague" That Understands Your Codebases

Ever wondered what it would be like to have instant, contextual answers to questions about your projects, without having to wait on others for responses?

Perhaps you want to onboard new team members, but don't have the budget to cover days, weeks or months of pair programming.

Maybe you're a solo developer who wants a second opinion on a feature from someone who knows the project as well as you do, but you don't want to share the context with strangers online.

If you're a Laravel developer you're in luck…

Meet Laragenie: The AI colleague made for the command line, ready to revolutionise your coding experience. After a few simple setup steps, you'll be able to query your projects in a way that was previously impossible. Onboarding, developer assistance and daily support have never been so easy.

Don't fret if you're not a Laravel dev either. Even though Laragenie runs on a Laravel app, you can still use it with any project in any language.

Laragenie example gif


Let's get set up

This package uses Laravel Prompts which supports macOS, Linux, and Windows with WSL. Due to limitations in the Windows version of PHP, it is not currently possible to use Laravel Prompts on Windows outside of WSL. For this reason, Laravel Prompts supports falling back to an alternative implementation such as the Symfony Console Question Helper.

You'll need a few things to get started: 

  • A Laravel app running version 10 or greater
  • PHP 8.1 or greater
  • An OpenAI developer account
  • A Pinecone developer account

If you already have a Laravel 10+ app running, great. If you don't, you can follow the instructions here.
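
For reference, a fresh Laravel app can be created with Composer (the app name below is just a placeholder, so call it whatever you like):

composer create-project laravel/laravel my-laragenie-app
cd my-laragenie-app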

OpenAI

Laragenie uses OpenAI for embeddings and AI responses.

If you don't have an OpenAI account, create one and generate an API key. You'll also need to add some credits to use the API; $5.00–10.00 will be enough to last a short while and should give you around 500–1000 AI responses. 

If you still think that's expensive, think of the time, cost and effort it would take to ask that many questions in forums or to co-workers.

Once you've completed your OpenAI setup, add the following to your .env file.

OPENAI_API_KEY=your-open-ai-key

Pinecone

Laragenie uses Pinecone to store vectors from your indexed files.

The easiest way to start is with a free account. Create an index with 1536 dimensions and name it whatever you'd like. Go ahead and generate an API key and add these details to your .env file:

PINECONE_API_KEY=your-pinecone-api-key
PINECONE_ENVIRONMENT=gcp-starter
PINECONE_INDEX=your-index

Installation in your Laravel app

Install the Laragenie package via Composer:

composer require joshembling/laragenie

You can publish and run the migrations with:

php artisan vendor:publish --tag="laragenie-migrations"
php artisan migrate

This will add a table to your database to save your AI responses.

You can publish the config file with:

php artisan vendor:publish --tag="laragenie-config"

Your published config will look something like this:

return [
    'bot' => [
        'name' => 'Laragenie', // The name of your chatbot
        'welcome' => 'Hello, I am Laragenie, how may I assist you today?', // Your welcome message
        'instructions' => 'Write in markdown format. Try to only use factual data that can be pulled from indexed chunks.', // The chatbot instructions
    ],

    'chunks' => [
        'size' => 1000, // Maximum number of characters per chunk
    ],

    'database' => [
        'fetch' => true, // Fetch saved answers from previous questions
        'save' => true, // Save answers to the database
    ],

    'extensions' => [ // The file types you want to index
        'php',
        'blade.php',
        'js',
    ],

    'indexes' => [
        'directories' => [], // The directories you want to index e.g. ['App/Models', 'App/Http/Controllers', '../frontend/src']
        'files' => [], // The files you want to index e.g. ['tests/Feature/MyTest.php']
        'removal' => [
            'strict' => true, // User prompt on deletion requests of indexes
        ],
    ],

    'openai' => [
        'embedding' => [
            'model' => 'text-embedding-ada-002', // Text embedding model 
            'max_tokens' => 5, // Maximum tokens to use when embedding
        ],
        'chat' => [
            'model' => 'gpt-4-1106-preview', // Your OpenAI GPT model
            'temperature' => 0.1, // Set temperature on the model
        ],
    ],

    'pinecone' => [
        'topK' => 2, // Number of matching vectors to fetch from Pinecone
    ],
];
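
To get a feel for what the chunks.size setting controls, here's a rough sketch of the idea: a file's contents are broken into fixed-size chunks before being embedded. This is only an illustration of the concept using plain PHP string functions, not Laragenie's actual implementation:

$contents = file_get_contents(app_path('Models/User.php'));

// Conceptual illustration only: split the file into chunks of at most 1,000 characters,
// mirroring the 'chunks.size' value above. Laragenie's own chunking logic may differ.
$chunks = str_split($contents, 1000);

foreach ($chunks as $i => $chunk) {
    echo sprintf("Chunk %d: %d characters\n", $i + 1, strlen($chunk));
}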

Once you've completed the steps above, you're ready to start using Laragenie. Open a terminal and run the following command from the root directory of your Laravel app:

php artisan laragenie

You will be greeted by Laragenie and see four options:

  1. Ask a question
  2. Index files
  3. Remove indexed files
  4. Something else

Use the arrow keys to move through the options and press Enter to select a command.


Ask a question


Note: you can only run this action once you have files indexed in your Pinecone vector database (skip to the 'Index Files' section if you wish to find out how to start indexing).

Once your vector database has indexed data, you'll be able to ask any question relating to your codebase.

Answers can be generated in markdown format with code examples, or in any format of your choosing. Use the bot.instructions config value to write AI instructions in as much detail as you need.
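
For example, you could tweak your published config so answers always include short code samples. The instruction string below is purely illustrative, so adjust it to suit your project:

'bot' => [
    'name' => 'Laragenie',
    'welcome' => 'Hello, I am Laragenie, how may I assist you today?',
    'instructions' => 'Write in markdown format. Include short code examples where relevant, and only use factual data that can be pulled from indexed chunks.',
],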

Beneath each response you will see the cost of generating it (in US dollars), which will help you keep close track of the expense. The cost of each response is also added to your database, if migrations are enabled.

Costs can vary, but small responses will be less than $0.01. Much larger responses can be between $0.02–0.05.
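
If you want to keep an eye on your total spend, you could sum the saved costs straight from the database. Here's a minimal sketch that assumes the published migration creates a laragenie table with a cost column; check the published migration for the actual table and column names before using it:

use Illuminate\Support\Facades\DB;

// Hypothetical table and column names: verify them against the published Laragenie migration.
$totalCost = DB::table('laragenie')->sum('cost');

echo 'Total AI spend so far: $'.number_format($totalCost, 4);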

Force AI

As previously mentioned, when you have migrations enabled your questions and answers will be saved to your database.

However, you may want to force AI usage (prevent fetching from the database) if you are unsatisfied with the initial answer. This will overwrite the answer already saved to the database.

To force an AI response, you will need to end your question with the --ai flag, e.g.

Tell me how users are saved to the database --ai

This ensures the AI model re-assesses your request and outputs another answer (which could be the same answer, depending on the GPT model you are using).

Index files


The quickest way to index files is to add values to the directories and/or files arrays in the Laragenie config. When you run the 'Index files' command you will always have the option to index/reindex these files, which helps keep your Laragenie indexes up to date.

Select 'yes' when prompted with Do you want to index your directories and files saved in your config?

'indexes' => [
    'directories' => ['App/Models', 'App/Http/Controllers'],
    'files' => ['tests/Feature/MyTest.php'],
    'removal' => [
        'strict' => true,
    ],
],

If you select 'no', you can also index files in the following ways:

  • Input a file name with its namespace e.g. App/Models/User.php
  • Input a full directory, e.g. Services
      - If you pass in a directory, Laragenie can only index files within this directory, and not its subdirectories.
      - To index subdirectories you must explicitly pass the path e.g. App/Models to index all of your models
  • Input multiple files or directories in a comma separated list e.g. App/Models, tests/Feature, App/Http/Controllers/Controller.php
  • Input multiple directories with wildcards e.g. App/Models/*.php
      - Please note that the wildcards must still match the file extensions in your Laragenie config file.

Indexing files outside of your Laravel project

You may use Laragenie in any way that you wish; you are not limited to indexing files inside your Laravel project directory.

For example, your Laravel project may live in a monorepo with two root entries such as frontend and backend. In this instance, you could move up one level to index more directories and files e.g. ../frontend/src/ or ../frontend/components/Component.js.

You can add these to your directories and files in the Laragenie config:

'indexes' => [
    'directories' => ['App/Models', 'App/Http/Controllers', '../frontend/src/'],
    'files' => ['tests/Feature/MyTest.php', '../frontend/components/Component.js'],
    'removal' => [
        'strict' => true,
    ],
],

Using this same method, you could technically index any files or directories you have access to on your server or local machine.

Ensure the extensions in your Laragenie config match all the file types that you want to index.

'extensions' => [
    'php', 'blade.php', 'js', 'jsx', 'ts', 'tsx', // etc...
],

Note: if your directories, paths or file names change, Laragenie will not be able to find the index if you decide to update/remove it later on (unless you truncate your entire vector database, or go into Pinecone and delete them manually).

Removing indexed files


You can remove indexed files using the same methods listed above, except via the directories or files arrays in the Laragenie config, which are currently used for indexing purposes only.

If you want to remove all files you may do so by selecting Remove all chunked data. Be warned that this will truncate your entire vector database and cannot be reversed.


To remove a comma separated list of files/directories, select the Remove data associated with a directory or specific file option.

Strict removal, i.e. warning messages before files are removed, can be turned on/off by changing the 'strict' attribute in your config.

'indexes' => [
    'removal' => [
        'strict' => true,
    ],
],

Stopping Laragenie

You can stop Laragenie using the following methods:

  • ctrl + c (Linux/Mac)
  • Selecting No thanks, goodbye in the user menu after at least 1 prompt has run.

It's as simple as that! Accelerate your workflow instantly and collaborate seamlessly with the quickest and most knowledgeable 'colleague' you've ever had.

Have fun using Laragenie! 🤖

👉 You can follow me on GitHub here: @joshembling

⭐ If you've enjoyed this article, please star and share the Laragenie project: https://github.com/joshembling/laragenie