Lab Programs
def generate_text(prompt, model_name, max_length, temperature, top_k, top_p):
    """
    Generates text from a prompt using a pre-trained GPT-2 model.

    Parameters:
        prompt (str): The input text to guide generation.
        model_name (str): The GPT-2 model variant to use (e.g., 'gpt2', 'gpt2-medium').
        max_length (int): The maximum length of the generated text.
        temperature (float): Sampling temperature for randomness (higher = more random text).
        top_k (int): The number of highest-probability tokens to consider.
        top_p (float): The cumulative probability for nucleus sampling.

    Returns:
        str: The generated text.
    """
    # Load tokenizer and model
    tokenizer = GPT2Tokenizer.from_pretrained(model_name)
    model = GPT2LMHeadModel.from_pretrained(model_name)

    # Encode the prompt into input token IDs
    input_ids = tokenizer.encode(prompt, return_tensors='pt')

    # Generate text
    with torch.no_grad():
        output = model.generate(
            input_ids,
            max_length=max_length,
            temperature=temperature,
            top_k=top_k,
            top_p=top_p,
            do_sample=True  # Enables sampling for diverse outputs
        )

    # Decode the generated token IDs back into a string
    return tokenizer.decode(output[0], skip_special_tokens=True)
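The `temperature`, `top_k`, and `top_p` parameters above control how the next token is sampled. Their combined effect can be illustrated without loading a model: the hypothetical helper below (not part of the transformers library) mimics the filtering that `model.generate` performs internally, keeping the `top_k` most likely tokens and then the smallest "nucleus" of them whose cumulative probability reaches `top_p`.

```python
import math

def top_k_top_p_filter(logits, top_k, top_p):
    """Return the indices a sampler would be allowed to pick from:
    keep the top_k highest logits, then the smallest prefix of those
    whose cumulative softmax probability reaches top_p."""
    # Rank token indices by logit, highest first, and keep the top_k
    ranked = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)[:top_k]
    # Softmax over the surviving logits
    exps = [math.exp(logits[i]) for i in ranked]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Nucleus (top_p) cut: stop once cumulative probability reaches top_p
    kept, cum = [], 0.0
    for idx, p in zip(ranked, probs):
        kept.append(idx)
        cum += p
        if cum >= top_p:
            break
    return kept

# Example: 5-token vocabulary; a low top_p keeps only the most likely tokens
print(top_k_top_p_filter([2.0, 1.0, 0.5, 0.1, -1.0], top_k=4, top_p=0.7))  # → [0, 1]
```

Raising `temperature` flattens the logits before this filtering, so more tokens survive the `top_p` cut and the output becomes more random.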
Image Synthesis using DALL-E: Dive into image generation with OpenAI’s DALL-E, creating unique and imaginative visuals based on textual descriptions. (The sample code below uses the open-source Stable Diffusion pipeline from the Hugging Face diffusers library, which serves the same purpose without requiring OpenAI API access.)
from diffusers import StableDiffusionPipeline
import torch

# Load a pre-trained Stable Diffusion pipeline and move it to the GPU
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16).to("cuda")
# Generate a PIL image from a text prompt and save it
image = pipe("an imaginative city floating in the clouds, digital art").images[0]
image.save("generated_image.png")
import pretty_midi

def create_simple_midi():
    """
    Generates a simple AI-composed MIDI melody and saves it as
    'generated_music.mid'.
    """
    # Create a PrettyMIDI object
    midi = pretty_midi.PrettyMIDI()
    # Add a piano instrument playing a short ascending melody (C D E F G)
    piano = pretty_midi.Instrument(program=0)
    for i, pitch in enumerate([60, 62, 64, 65, 67]):
        piano.notes.append(pretty_midi.Note(velocity=100, pitch=pitch,
                                            start=i * 0.5, end=(i + 1) * 0.5))
    midi.instruments.append(piano)
    # Write the melody to disk
    midi.write('generated_music.mid')
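For background on choosing pitch values for a melody: MIDI represents every note as an integer, with middle C (C4) at 60 and each semitone adding one. A small hypothetical helper (the name `note_to_midi` is just for illustration, not part of pretty_midi) shows the conversion:

```python
def note_to_midi(name, octave):
    """Convert a note name like 'C' or 'F#' plus an octave number to a
    MIDI pitch. MIDI convention: C4 (middle C) = 60, +1 per semitone."""
    semitones = {'C': 0, 'C#': 1, 'D': 2, 'D#': 3, 'E': 4, 'F': 5,
                 'F#': 6, 'G': 7, 'G#': 8, 'A': 9, 'A#': 10, 'B': 11}
    return 12 * (octave + 1) + semitones[name]

print(note_to_midi('C', 4))  # middle C
print(note_to_midi('A', 4))  # concert A
```

This is why a list like [60, 62, 64, 65, 67] spells the scale fragment C-D-E-F-G in octave 4.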
Code Generation with OpenAI Codex: Try your hand at code generation using OpenAI Codex, which is proficient in understanding and generating programming code. The example below uses Google Gemini instead: use https://aistudio.google.com/prompts/new_chat to generate an API key from Google Gemini, and install the required package with pip install google-generativeai
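Before wiring the key into a script, it is safer to keep it out of the source file entirely. A minimal stdlib-only sketch, assuming an environment variable named GEMINI_API_KEY (the name is just a convention chosen here):

```python
import os

def load_api_key(var='GEMINI_API_KEY'):
    """Fetch the API key from the environment, failing loudly if unset."""
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set the {var} environment variable before running.")
    return key

os.environ['GEMINI_API_KEY'] = 'demo-key'  # for illustration only
print(load_api_key())
```

In practice you would export the variable in your shell rather than assigning it in code, so the key never appears in the lab submission.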
import os
import google.generativeai as genai

# Set your API key here (replace 'your_api_key_here' with your actual key)
os.environ['API_KEY'] = 'your_api_key_here'
genai.configure(api_key=os.environ['API_KEY'])

# Create the model and read a prompt from the user
model = genai.GenerativeModel('gemini-1.5-flash')
user_prompt = input("Enter your prompt: ")

# Generate response
response = model.generate_content(user_prompt)
print("\nGenerated Response:\n")
print(response.text)