With all the buzz around GPT-4 by OpenAI, I've been searching for the answer to my burning question: can generative AI help me code faster? Today the answer is yes-ish.

Here is a short demo showing how I made a quick and dirty AI copilot to help me code using Google’s next generation large language model PaLM 2.

A demo of the AI copilot I am building with Google PaLM 2 API.

I am surprised by how far I was able to get with the demo. The approach to making this type of product is to treat the AI as a magic black box that takes natural-language input and returns code output. From there, you use metaprogramming to parse the code and execute it within the program's context. Here's an example of the main function.

import ast
import builtins

import google.generativeai as palm


def main():
  query = input("\nEnter your query: \n")
  passage = find_best_passage(query, df)
  prompt = make_prompt(query, passage)
  answer = palm.generate_text(prompt=prompt)

  # Strip the ```python fences the model wraps around its answer
  code = answer.candidates[0]['output'].removeprefix("```python\n").removesuffix("\n```")
  parsed = ast.parse(code)

  # Extract the function definition from the parsed code
  function_def = parsed.body[0]
  function_code = compile(ast.Module(body=[function_def], type_ignores=[]), '<ast>', 'exec')
  function_name = function_def.name

  # Create a global namespace to hold the function
  global_namespace = {}

  # Execute the function code in the global namespace
  exec(function_code, builtins.__dict__, global_namespace)

  # Call the function
  result = global_namespace[function_name]()

  # Print the result
  print(result)

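The helpers find_best_passage and make_prompt handle the retrieval side. A minimal sketch, assuming the passages live in a pandas DataFrame with Text and precomputed Embeddings columns; embed_fn here is a toy stand-in for the real PaLM embedding call, and all names are illustrative:

```python
import numpy as np
import pandas as pd

def embed_fn(text):
  # Toy embedding for this sketch: a normalized letter-frequency vector.
  # The real copilot would call the PaLM embedding endpoint instead.
  counts = np.array([text.lower().count(c) for c in "abcdefghijklmnopqrstuvwxyz"], dtype=float)
  norm = np.linalg.norm(counts)
  return counts / norm if norm else counts

def find_best_passage(query, df):
  # Score every stored passage by dot product against the query embedding
  # and return the text of the best match.
  query_embedding = embed_fn(query)
  dot_products = np.stack(df['Embeddings'].tolist()) @ query_embedding
  return df.iloc[int(np.argmax(dot_products))]['Text']

def make_prompt(query, passage):
  # Ask for a single Python function wrapped in a code fence, so the
  # caller can strip the fence and parse the body with ast.
  return ("You are a coding assistant. Using the reference passage below, "
          "answer the query with one Python function inside a ```python fence.\n\n"
          f"Reference: {passage}\n\nQuery: {query}\n")
```

The dot-product ranking assumes the embeddings are normalized, which makes it equivalent to cosine similarity.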
The next step is to add long-term memory, so that changes are saved as the current working context when talking with the AI.
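One minimal way to sketch that, assuming a JSON file as the persistence layer (the file name and function names are illustrative): append each generated snippet to disk, then prepend the saved snippets to the next prompt so the model always sees the current working context.

```python
import json
from pathlib import Path

MEMORY_FILE = Path("copilot_memory.json")  # illustrative location

def load_memory():
  # Return the list of previously generated snippets (empty on first run).
  if MEMORY_FILE.exists():
    return json.loads(MEMORY_FILE.read_text())
  return []

def save_snippet(code):
  # Append a generated snippet to the persistent working context.
  snippets = load_memory()
  snippets.append(code)
  MEMORY_FILE.write_text(json.dumps(snippets, indent=2))

def make_prompt_with_memory(query, passage):
  # Prepend the saved snippets so the model edits the existing context
  # rather than starting from scratch on every turn.
  context = "\n\n".join(load_memory())
  return (f"Current code context:\n{context}\n\n"
          f"Reference: {passage}\n\nQuery: {query}\n")
```

Beyond a certain context size you would want to summarize or retrieve selectively rather than prepend everything, but a flat file is enough to make the demo stateful.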

Last updated: 2023-06-06
