Added

Extremely Large (Beta) Release

Our biggest and most performant generation model is now available. Extremely Large (Beta) outperforms our previous large model on a variety of downstream tasks, including sentiment analysis, named entity recognition (NER), and common sense reasoning, as measured by our internal benchmarks. You can access Extremely Large (Beta) as xlarge-20220301. Note that while in Beta, this model has a maximum token length of 1024 tokens and a maximum num_generations of 1.
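
As a quick illustration, here is a minimal sketch of calling the new model by its identifier through the Python SDK's Generate endpoint. The client setup, prompt, and parameter values shown are assumptions for the example, not part of this release; only the model name xlarge-20220301 and the Beta limits (1024 tokens, num_generations of 1) come from the note above.

```python
import cohere

# Sketch only: assumes the standard Python SDK and Generate endpoint.
co = cohere.Client("YOUR_API_KEY")  # placeholder key, not a real credential

response = co.generate(
    model="xlarge-20220301",  # Extremely Large (Beta)
    prompt="Write a product description for a solar-powered lantern:",
    max_tokens=200,           # stay within the 1024-token limit while in Beta
    num_generations=1,        # maximum of 1 while in Beta
)

print(response.generations[0].text)
```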