why doesn't response_mime_type = json work here?

class init:

```
self.generation_config = {
    "temperature": 1,
    "top_p": 0.95,
    "top_k": 0,
    "max_output_tokens": 8192,
}
```

```
def process_text(self, model_name, safety_settings, generation_config, text, system_instruction, user_prompt):
    self.configure_api()
    try:
        model = genai.GenerativeModel(model_name, safety_settings, generation_config)
    except Exception as e:
        logging.error(f"error creating model {e}")
        print(f"error creating model {e}")
    prompt = [system_instruction, user_prompt, text]

    try:
        response = model.generate_content(prompt, generation_config={'response_mime_type': 'application/json'})
    except Exception as e:
        logging.error(f"error generating content {e}")
        print(f"error generating content {e}")
    print(response)
```

more code:

```
def create_model(self, model_name, safety_settings, generation_config, system_instruction):
    model = genai.GenerativeModel(model_name, safety_settings, generation_config, system_instruction)
    return model
```



```
Traceback (most recent call last):
  File "/Users/fred/bin/nimble/bookpublisherGPT/classes/SyntheticReaders/gemini2syntheticreaders/text2gemini.py", line 91, in <module>
    t2g.process_text(t2g.model_name, t2g.safety_settings, t2g.generation_config, text, t2g.system_instruction,
  File "/Users/fred/bin/nimble/bookpublisherGPT/classes/SyntheticReaders/gemini2syntheticreaders/text2gemini.py", line 72, in process_text
    print(response)
UnboundLocalError: local variable 'response' referenced before assignment
```


3 REPLIES

Two things.

First, you don't need to pass a `generation_config` object to both the constructor and the `generate_content` method. You can pass it to either the constructor or the method.

For example:

```
generation_config = {
    "temperature": 1,
    "max_output_tokens": 2048,
    "response_mime_type": "application/json"
}
safety_settings = {
    genai.HarmCategory.HARM_CATEGORY_DANGEROUS_CONTENT: genai.HarmBlockThreshold.BLOCK_ONLY_HIGH,
    genai.HarmCategory.HARM_CATEGORY_HATE_SPEECH: genai.HarmBlockThreshold.BLOCK_ONLY_HIGH,
    genai.HarmCategory.HARM_CATEGORY_HARASSMENT: genai.HarmBlockThreshold.BLOCK_ONLY_HIGH,
    genai.HarmCategory.HARM_CATEGORY_SEXUALLY_EXPLICIT: genai.HarmBlockThreshold.BLOCK_ONLY_HIGH,
}

try:
    model = genai.GenerativeModel(model_name, generation_config=generation_config, safety_settings=safety_settings)
except Exception as e:
    print(f"error creating model {e}")
prompt = ["what is the capital of south africa? reply in json format for example {'country': 'italy', 'capital': 'Rome'}"]

try:
    response = model.generate_content(prompt)
except Exception as e:
    print(f"error generating content {e}")
print(response)
```
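
For completeness, the other option mentioned above, passing the generation config at call time instead of in the constructor, would look roughly like this. This is a minimal sketch; the API key, model name, and prompt are placeholders, not taken from the original post:

```
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # placeholder key

# No generation_config in the constructor; it is supplied per call instead.
model = genai.GenerativeModel("gemini-1.5-pro-latest")  # placeholder model name

response = model.generate_content(
    'List three colours as JSON, e.g. {"colours": ["red"]}',
    generation_config={"response_mime_type": "application/json"},
)
print(response.text)
```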

Second, which model are you using? "gemini-1.0-pro-vision-001" does not support `"response_mime_type": "application/json"`, but "gemini-1.5-pro-preview-0409" does.

Also, your traceback is misleading: you are probably not showing the print output from the exceptions being caught. I would remove the try/except blocks to see what the errors actually are.
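
As a sketch of that suggestion, the method could be rewritten without the try/except so the real exception surfaces instead of the `UnboundLocalError` on `response`. This assumes the same `process_text` signature and a `generation_config` dict as in the question; it is not the exact original code:

```
def process_text(self, model_name, safety_settings, generation_config, text, system_instruction, user_prompt):
    self.configure_api()
    # Use a single generation_config and put response_mime_type into it,
    # rather than passing a config to both the constructor and generate_content.
    generation_config = {**generation_config, "response_mime_type": "application/json"}
    model = genai.GenerativeModel(
        model_name,
        safety_settings=safety_settings,
        generation_config=generation_config,
    )
    prompt = [system_instruction, user_prompt, text]
    # No try/except: if model creation or generation fails, the real error
    # is raised here instead of falling through to an unbound `response`.
    response = model.generate_content(prompt)
    print(response)
    return response
```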

If you find this answers your question, please do not forget to accept it.

Thanks Dario.

Dario set me on the right path. I was using version 0.52 of google-generativeai. When I ran `pip install -U google-generativeai` to upgrade to version 0.62, the problem went away.
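
For anyone hitting the same thing, one quick way to confirm which version is actually installed before and after upgrading (this relies on the package's `__version__` attribute, which I believe google-generativeai exposes):

```
import google.generativeai as genai

print(genai.__version__)  # prints the installed google-generativeai version
```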