# Generation Parameters

{% hint style="danger" %}
These docs are outdated! Please check out <https://docs.titanml.co> for the latest information on the TitanML platform.\
\
If there's anything that's not covered there, please contact us on our [discord](https://discord.com/invite/83RmHTjZgf).
{% endhint %}

The API supports the standard generation parameters. See below for a description.

To use these parameters, include them in the JSON payload of your request:

```python
import requests

if __name__ == "__main__":
    input_text = "List 3 things to do in London."

    url = "http://localhost:8000/generate_stream"
    payload = {
        "text": input_text,
        "sampling_temperature": 0.1,
        "no_repeat_ngram_size": 3,
    }

    # Stream the generated text back token by token
    response = requests.post(url, json=payload, stream=True)
    response.encoding = "utf-8"

    for text in response.iter_content(chunk_size=1, decode_unicode=True):
        if text:
            print(text, end="", flush=True)
```

| Parameter Name              | Description                                                                                       | Default Value        |
| --------------------------- | ------------------------------------------------------------------------------------------------- | -------------------- |
| **generate\_max\_length**   | The maximum generation length                                                                     | 128                  |
| **sampling\_topk**          | Sample predictions from the top K most probable candidates                                        | 1                    |
| **sampling\_topp**          | Sample from the smallest set of candidates whose cumulative probability exceeds this value        | 1.0 (no restriction) |
| **sampling\_temperature**   | Sample with randomness. Bigger temperatures are associated with more randomness and 'creativity'. | 1.0                  |
| **repetition\_penalty**     | Penalize the generation of tokens that have already been generated. Set to > 1 to penalize.       | 1 (no penalty)       |
| **no\_repeat\_ngram\_size** | Prevent repetitions of ngrams of this size.                                                       | 0 (turned off)       |
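These parameters can be combined freely in a single payload. As a minimal sketch (the values below are illustrative examples, not tuned recommendations), a payload mixing several of the parameters from the table might look like:

```python
# Illustrative payload combining several generation parameters.
# The values are examples only, not recommended settings.
payload = {
    "text": "List 3 things to do in London.",
    "generate_max_length": 64,      # cap the generated sequence length
    "sampling_topk": 50,            # sample from the 50 most probable tokens
    "sampling_topp": 0.9,           # nucleus sampling cumulative-probability cutoff
    "sampling_temperature": 0.7,    # moderate randomness
    "repetition_penalty": 1.2,      # > 1 discourages repeated tokens
}

# Post it to the same endpoint as above (assumes a server on localhost):
# response = requests.post("http://localhost:8000/generate_stream", json=payload, stream=True)
```

Any parameter omitted from the payload falls back to its default value from the table.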
