You are truly a godsend!
#16 opened by lcahill
✔️ Tool usage with specific tool tokens and fine tuning
✔️ Real open license
✔️ Enhanced inference library
✔️ Safetensors from the start
Thank you so much for your invaluable contribution to the community!
A few questions:
- Is the `mistral_inference` library using constrained generation to ensure that tools are called with the correct syntax?
- Does the model use its own 'judgment' to decide whether or not to use a given tool?
- I am getting an error on this line from the README. Is this a versioning issue? This follows a fresh `pip install mistral_inference` on Windows:

```python
from mistral_inference.model import Transformer
```

Error:

```
ModuleNotFoundError: No module named 'mistral_inference.model'
```
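On the import error: this usually means the class moved to a different module path in a newer release of the package. One version-tolerant workaround is to try several candidate module paths and take the first that imports. The helper below is a sketch of that pattern; `import_first` is a hypothetical name, and the exact replacement module inside `mistral_inference` is an assumption, not confirmed from this thread.

```python
import importlib


def import_first(candidates, name):
    """Return attribute `name` from the first importable module in
    `candidates`. Handy when a class moves between library releases."""
    for mod_path in candidates:
        try:
            return getattr(importlib.import_module(mod_path), name)
        except (ImportError, AttributeError):
            continue
    raise ImportError(f"{name!r} not found in any of: {', '.join(candidates)}")


# Demonstration with a stdlib module (mistral_inference may not be
# installed in this environment); the bogus first path is skipped:
loads = import_first(["no_such_module_xyz", "json"], "loads")
```

For the README case you would call something like `import_first(["mistral_inference.transformer", "mistral_inference.model"], "Transformer")`, trying the newer path first and the legacy one from the README second; check the installed version's source tree for the actual module name.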
lcahill changed discussion status to closed