Saturday, June 29, 2024

Use AI Model as Compiler - Install Meta LLM Compiler 13B Locally

This video shows how to install Meta LLM Compiler 13B locally and walks through a step-by-step tutorial: convert a Python function to LLVM-IR, then ask the model to optimize that IR.


Code:



=======================================
Python Function:
=======================================

def add_two(a, b):
    return a + b

=======================================
Convert the Python Function to LLVM-IR:
=======================================

pip install llvmlite

from llvmlite import ir, binding

# Define the function type
func_type = ir.FunctionType(ir.IntType(32), [ir.IntType(32), ir.IntType(32)])

# Create a module
module = ir.Module(name="my_module")
module.triple = binding.get_default_triple()

# Create the function and add it to the module
func = ir.Function(module, func_type, name="add_two")

# Define the entry block
entry_block = func.append_basic_block(name="entry")
builder = ir.IRBuilder(entry_block)

# Define function arguments
a, b = func.args
a.name = "a"
b.name = "b"

# Add the instruction to add the two arguments
result = builder.add(a, b, name="res")

# Return the result
builder.ret(result)

# Print the LLVM-IR
print(module)
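
Running this script prints the module's textual IR. llvmlite may quote identifiers in its output (for example @"add_two"), but the code is equivalent to the cleaned-up listing below, which is what gets pasted into the optimization prompt.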

=======================================
Ask LLM Compiler to Optimize it:
=======================================

Optimize the following LLVM-IR code using opt -O3:

; ModuleID = 'my_module'
target triple = "x86_64-unknown-linux-gnu"
define i32 @add_two(i32 %a, i32 %b) {
entry:
  %res = add i32 %a, %b
  ret i32 %res
}
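
=======================================
Run the Prompt Against the Local Model (sketch):
=======================================

The video drives the model interactively after installing it locally. As a rough illustration only, the snippet below shows one way to send the same prompt to a local copy of LLM Compiler 13B with Hugging Face transformers. The model id "facebook/llm-compiler-13b" and the generation settings are assumptions rather than the exact setup from the video, so adjust them to match your own install.

pip install transformers torch accelerate

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Assumption: the weights were downloaded from Hugging Face under this id;
# replace it with the local directory you installed the model into.
model_id = "facebook/llm-compiler-13b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 to reduce memory; 13B still needs a large GPU
    device_map="auto",          # requires the accelerate package
)

# The exact prompt from the section above.
prompt = """Optimize the following LLVM-IR code using opt -O3:

; ModuleID = 'my_module'
target triple = "x86_64-unknown-linux-gnu"
define i32 @add_two(i32 %a, i32 %b) {
entry:
  %res = add i32 %a, %b
  ret i32 %res
}
"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)

# Print only the newly generated tokens (the model's optimized IR).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))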
