Drupal AI Code Generation

Jay Callicott
May 11, 2024


Today, I’m thrilled to unveil a game-changing tool I’ve been diligently crafting over the past week. This tool harnesses the power of AI models to effortlessly generate complete Drupal module code.

GitHub Links

How it Works

The premise is elegantly simple — we’re taking a cue from existing code generation tools like Drush’s generate command. But instead of just boilerplate code snippets, why not let AI do the heavy lifting and conjure up an entire Drupal module? That's precisely what this tool accomplishes, with minimal configuration fuss.

Getting Started

To kick things off, you’ll want to configure the prompt template. Don’t fret; the module comes pre-loaded with a default template that you can tweak to your heart’s content. Next up, you’ll need to supply API keys for the AI models you wish to utilize. Fortunately, many of these APIs offer free or affordable access.

The Models

I’ve integrated four stellar models into this tool: ChatGPT, Gemini, Claude 3, and Llama 3 (assuming you’re running it locally using Ollama). Integrating these models with Drush is a breeze, owing to their similar APIs.

Understanding Llama 3

A quick word about Llama 3 — the catch is that most standard laptops can only handle the 8B version, which isn’t quite as adept as its larger counterparts. It’s fantastic for testing purposes, but expect a few more hiccups compared to the others. If you’ve got a robust gaming rig, you might be able to rock the 70B model, though I can’t guarantee it.

Screenshot of Meta Llama 3

Configuration Demystified

Let’s delve into module configuration. Here’s where you’ll input your API keys and tweak the default prompt template. Notice the placeholders for the module name and instructions — these will be filled in by you at the command line. Also, we’re requesting the models to return data in structured XML format for consistency.
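To make the idea concrete, here's an illustrative template. The wording, placeholder names, and XML element names below are my own for illustration; the module's shipped default may differ.

```text
Generate a complete Drupal 10 module named {module_name}.
Instructions: {instructions}

Return your answer as XML in exactly this structure, one <file>
element per file in the module:

<files>
  <file>
    <filename>my_module.info.yml</filename>
    <content><![CDATA[ ...file contents... ]]></content>
  </file>
</files>
```

Asking for a rigid XML envelope like this is what lets a single parser handle responses from all four models.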

DrupalAI configuration screen

Under the Hood

At its core, the module operates on a straightforward principle. We feed a prompt to a model, parse the XML response to extract filenames and content suggestions, and then write those to files in the module directory.
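That parse-and-write step can be sketched roughly as follows. The function name, the XML element names, and the directory handling here are assumptions for illustration, not the module's exact schema.

```php
<?php

// Sketch of the parse-and-write step. The element names (<files>,
// <file>, <filename>, <content>) and the helper name are assumed
// for illustration and may differ from the module's actual schema.
function drupalai_write_module_files(string $xml, string $module_dir): array {
  $written = [];
  $doc = new SimpleXMLElement($xml);

  foreach ($doc->file as $file) {
    $filename = (string) $file->filename;
    $path = $module_dir . '/' . $filename;

    // Create nested directories (e.g. src/Plugin/Block) as needed.
    if (!is_dir(dirname($path))) {
      mkdir(dirname($path), 0775, TRUE);
    }

    file_put_contents($path, (string) $file->content);
    $written[] = $filename;
  }

  return $written;
}
```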

Peeking into the Code

Want to see what’s happening under the hood? Take a gander at the code. It’s not overly complex. We’re simply passing the prompt to an API endpoint, receiving back XML data, and translating that into module files. Currently, we’re only providing one user input, but future iterations may track conversation history for improved interactions.

...

// Append the user's prompt to the conversation history.
$this->contents[] = [
  "role" => "user",
  "content" => $prompt,
];

try {
  // Send the full conversation to the chat completions endpoint.
  $response = $client->request('POST', $url, [
    'headers' => [
      'Content-Type' => 'application/json',
      'Authorization' => 'Bearer ' . $api_key,
    ],
    'json' => [
      "model" => "gpt-4-turbo",
      "messages" => $this->contents,
      "temperature" => 1,
      "max_tokens" => 4096,
      "top_p" => 1,
      "frequency_penalty" => 0,
      "presence_penalty" => 0,
    ],
  ]);
}
catch (RequestException $e) {
  \Drupal::logger('drupalai')->error($e->getMessage());
  return FALSE;
}

if ($response->getStatusCode() != 200) {
  \Drupal::logger('drupalai')->error($response->getBody()->getContents());
  return FALSE;
}
else {
  $data = $response->getBody()->getContents();
  $content = json_decode($data)->choices[0]->message->content;

  // Record the assistant's reply so future requests retain context.
  $this->contents[] = [
    "role" => "assistant",
    "content" => $content,
  ];

  return $content;
}

...

Command Line Inputs

Using the module via the command line is a breeze. Select your desired model, provide the module name and instructions, and voila! You’ve got yourself a module. You even have the option to attempt module enablement right then and there.
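For a rough sense of the workflow, an invocation might look something like this. The command name, option names, and module name below are hypothetical; the actual command is defined by the drupalai module and may differ.

```text
# Hypothetical invocation — actual command and option names may differ.
drush drupalai:create ai_demo \
  --model=claude \
  --instructions="A block that lists the five most recent articles"
```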

Example using the new Drush AI command

Future Enhancements

Looking ahead, there are several improvements I’m eager to implement. Chief among them is support for multiple round trips to the same model, crucial for cases where token limits hinder complete responses. Additionally, empowering the tool to accept feedback and refine its module generation process could be a game-changer.

Final Thoughts

In closing, I must stress that this tool is an experiment — a foray into uncharted territory. While the generated code may not always be production-ready, remember that AI models are evolving rapidly. I recommend testing your prompts in chat clients before using the Drush command for optimal results.

As always, your feedback is invaluable. I’ve made all code publicly available, so dive in and tinker to your heart’s content. Until next time — happy coding!

Feedback and comments are more than welcome!
