What is GPT-NeoX?

This repository implements model-parallel autoregressive transformers on GPUs using the DeepSpeed library, and documents EleutherAI's framework for training large language models. It builds on NVIDIA's Megatron-LM, integrating techniques from DeepSpeed together with novel optimizations of its own. The goal is to provide a central repository of techniques for training large-scale autoregressive language models, accelerating research on large-scale training and encouraging collaboration among researchers in the field.
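
For context, the repository's README describes launching training runs through a deepy.py wrapper around the DeepSpeed launcher, passing one or more YAML configuration files. The sketch below shows what a minimal launch might look like; the specific config file names are illustrative, patterned on the sample configs the repository ships, and may differ in your checkout.

```python
# A minimal sketch of launching a GPT-NeoX training run from Python.
# Assumes you have cloned github.com/EleutherAI/gpt-neox, installed its
# requirements, and are running from the repository root.
import subprocess

# Illustrative config names; the repo ships sample YAML configs under ./configs,
# and multiple files are merged into one training configuration.
config_files = ["configs/125M.yml", "configs/local_setup.yml"]

# deepy.py wraps the DeepSpeed launcher: it spawns one worker process per GPU
# and forwards the merged YAML configuration to train.py.
subprocess.run(["python", "./deepy.py", "train.py", *config_files], check=True)
```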

Pricing

Price Starts At:
Free
Price Overview:
Open source
Free Version:
Free Version available.

Company Facts

Company Name:
EleutherAI
Date Founded:
2020
Company Website:
github.com/EleutherAI/gpt-neox

Product Details

Deployment
SaaS
On-Prem
Training Options
Documentation Hub

Target Company Sizes
Individual
1-10
11-50
51-200
201-500
501-1000
1001-5000
5001-10000
10001+
Target Organization Types
Mid Size Business
Small Business
Enterprise
Freelance
Nonprofit
Government
Startup
Supported Languages
English

GPT-NeoX Categories and Features