
Ollama with curl


1. Response contains a single file only
curl http://localhost:11434/api/generate -d '{
  "model": "qwen3:4b",
  "prompt": "Write a simple chat server in Ruby. Respond using JSON.",
  "stream": false,
  "format": {
    "type": "object",
    "properties": {
      "filename": {
        "type": "string"
      },
      "content": {
        "type": "string"
      }
    },
    "required": [
      "filename",
      "content"
    ]
  }
}' | jq
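The structured output does not come back as top-level JSON: /api/generate wraps it as a string in the "response" field, so it has to be parsed a second time. A minimal sketch using jq's fromjson; the sample response below is illustrative, not real model output, and the "filename"/"content" keys match the schema above.

```shell
# Sample of what /api/generate returns: the generated JSON arrives as a
# string inside the "response" field.
response='{"model":"qwen3:4b","response":"{\"filename\":\"chat_server.rb\",\"content\":\"require \\\"socket\\\"\"}","done":true}'

# Parse the inner JSON with fromjson, then pull out the two schema fields.
filename=$(echo "$response" | jq -r '.response | fromjson | .filename')

# Write the generated source code to the filename the model chose.
echo "$response" | jq -r '.response | fromjson | .content' > "$filename"
```

Piping the real curl output through the same two jq expressions saves the model's answer straight to disk.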

2. Response can contain multiple files (qwen3:4b was not working; a larger model such as llama3:8b is needed)
# Alternative, more explicit prompt that was also tried:
# "prompt": "Respond ONLY with JSON. No explanations. Return exactly two files in a \"files\" array. Each object has \"filename\" and \"content\".\n\n1. server.rb: Ruby TCP server on port 8080. Uses Socket. Echoes messages.\n2. client.rb: Ruby TCP client. Connects to localhost:8080. Sends user input.\n\nExample: {\"files\":[{\"filename\":\"server.rb\",\"content\":\"require...\"},{\"filename\":\"client.rb\",\"content\":\"require...\"}]}"
curl http://localhost:11434/api/generate -d '{
  "model": "llama3:8b",
  "prompt": "Write a simple TCP socket chat server and client in Ruby.",
  "stream": false,
  "format": {
    "type": "object",
    "properties": {
      "files": {
        "type": "array",
        "items": {
          "type": "object",
          "properties": {
            "filename": { "type": "string" },
            "content": { "type": "string" }
          },
          "required": ["filename", "content"]
        }
      }
    },
    "required": ["files"]
  }
}' | jq
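For the multi-file schema, the stringified "response" holds a "files" array, so each element has to be written out separately. A minimal sketch iterating over the array by index with jq; the sample response and its file contents are illustrative, not real model output.

```shell
# Sample multi-file response: "response" is a JSON string containing a
# "files" array, matching the schema above.
response='{"model":"llama3:8b","response":"{\"files\":[{\"filename\":\"server.rb\",\"content\":\"# server\"},{\"filename\":\"client.rb\",\"content\":\"# client\"}]}","done":true}'

# Count the files, then write each one under the filename the model chose.
count=$(echo "$response" | jq '.response | fromjson | .files | length')
for i in $(seq 0 $((count - 1))); do
  f=$(echo "$response" | jq -r ".response | fromjson | .files[$i].filename")
  echo "$response" | jq -r ".response | fromjson | .files[$i].content" > "$f"
done
```

Replacing the sample variable with the real curl output drops both server.rb and client.rb into the current directory.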
https://github.com/ollama/ollama/blob/main/docs/api.md#generate-a-completion
#ollama #ml #ai