DGPT

@DGPTeth

684 Followers
37 Following
18 Media
33 Statuses

DGPT - Decentralized Generative Pre-Trained Transformers. CA: 0x9b06203B932e3a1693bDf888EcCBe5c39406FEd4. TG

Joined June 2024
@DGPTeth
DGPT
4 months
Check out our in-depth description of the $DGPT project and our website #DGPT
6
3
22
@DGPTeth
DGPT
4 months
$DGPT successfully launched today with over $200k volume in its first hours, kicking off Decentralized Generative Pre-Trained Transformers! With liquidity locked, the contract renounced, and taxes lowered to 3/3, Decentralized GPT is advancing the nexus of innovation in crypto and AI
Tweet media one
3
3
18
@DGPTeth
DGPT
4 months
DSFT: Decentralized Supervised Fine-Tuning. After training the model with the DUPT likelihood objective, we adapt the parameters to the supervised target task. Oracles with labeled datasets hold instances consisting of a sequence of input tokens along with a label. (worked objective below)
Tweet media one
2
5
18
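The DSFT tweet above mirrors the supervised fine-tuning stage of a standard GPT-style model. A worked version of that objective, assuming the usual notation (a labeled dataset C of token sequences x^1, ..., x^m with label y, final transformer activation h_l^m, and an added output matrix W_y; none of these symbols appear in the thread itself):

P(y \mid x^1, \ldots, x^m) = \mathrm{softmax}(h_l^m W_y)

L_2(C) = \sum_{(x,y)} \log P(y \mid x^1, \ldots, x^m)

The labeled inputs pass through the pre-trained model to obtain h_l^m, and fine-tuning maximizes the log-likelihood of the correct labels; the tweet's oracles would supply the labeled instances (x, y).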
@DGPTeth
DGPT
4 months
DGPT-1 aims to place on the Elo rating leaderboard of the @UCBerkeley LMSYS Chatbot Arena, a crowdsourced open platform for the evaluation of LLMs (Elo sketch below). The next phase of the project is to begin training the decentralized LLM of the future. @DGPTeth $DGPT
1
5
18
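For context on the leaderboard this tweet targets: the LMSYS Chatbot Arena originally ranked models with an online Elo update over crowdsourced pairwise battles. A minimal sketch of that update, using standard Elo notation rather than anything from the thread (ratings R_A and R_B, outcome S_A in {0, 1/2, 1}, and a K-factor):

E_A = \frac{1}{1 + 10^{(R_B - R_A)/400}}

R_A' = R_A + K (S_A - E_A)

A model's rating rises most when it wins matchups it was expected to lose, so placing on the leaderboard means consistently winning votes against higher-rated models.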
@DGPTeth
DGPT
4 months
$DGPT Contract Renounced #DGPT
Tweet media one
0
3
16
@DGPTeth
DGPT
4 months
DUPT: Use a multi-layer Transformer decoder as the language model: apply a multi-headed self-attention operation over the input context tokens, followed by position-wise feed-forward layers, to produce an output distribution using the token and position embedding matrices. (equations below)
Tweet media one
2
5
16
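The tweet above compresses the standard decoder-only forward pass into one sentence. Written out, with U the matrix of context tokens, W_e the token embedding matrix, W_p the position embedding matrix, and n decoder layers (standard GPT-style notation, assumed here rather than quoted from any DGPT material):

h_0 = U W_e + W_p

h_l = \mathrm{transformer\_block}(h_{l-1}) \quad \text{for } l \in [1, n]

P(u) = \mathrm{softmax}(h_n W_e^{\top})

Each block applies masked multi-headed self-attention over the context followed by a position-wise feed-forward layer, and the output distribution reuses the token embedding matrix, matching the tweet's description.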
@DGPTeth
DGPT
4 months
LP Locked for DGPT/ETH @Uniswap 🔐 $DGPT #DGPT
Tweet media one
0
3
16
@DGPTeth
DGPT
4 months
DUPT: Decentralized Unsupervised Pre-Training. An oracle with a corpus of tokens maximizes a likelihood function under a standard language-modeling objective. The conditional probability is modeled with neural network parameters trained via stochastic gradient descent. (objective below)
Tweet media one
2
5
16
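A worked form of the likelihood this DUPT tweet refers to, assuming the standard language-modeling setup (an unlabeled token corpus U = {u_1, ..., u_n}, context window k, and network parameters Θ; the notation is assumed, not quoted from the thread):

L_1(U) = \sum_i \log P(u_i \mid u_{i-k}, \ldots, u_{i-1}; \Theta)

Each oracle would maximize L_1 over its own corpus, with the conditional probability modeled by the Transformer decoder and Θ updated by stochastic gradient descent, as the tweet states.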
@DGPTeth
DGPT
4 months
@punk6529 Now imagine the Decentralized Unsupervised Pre-Training, then the Decentralized Supervised Fine-Tuning, required for Decentralized Generative Pre-Trained Transformers to make the open-source LLM of the future, where proprietary models and centralized training methodologies cannot scale.
0
6
16
@DGPTeth
DGPT
4 months
The time has come $DGPT #DGPT Watch for us going live
Tweet media one
0
2
15
@DGPTeth
DGPT
4 months
$DGPT Decentralized Generative Pre-Trained Transformers #DGPT
Tweet media one
0
3
14
@DGPTeth
DGPT
4 months
👾
0
4
14
@DGPTeth
DGPT
4 months
$DGPT is coming soon... be ready for @DGPTeth to go live shortly today! #DGPT
Tweet media one
0
3
14
@DGPTeth
DGPT
3 months
🧠 $DGPT @DGPTeth IYKYK
Tweet media one
1
1
6
@DGPTeth
DGPT
3 months
DGPT is seeking AI experts to review the DGPT-1 architecture and collaborate on building the decentralized model training of the future with @DGPTeth $DGPT
Tweet media one
0
1
5