OpenThaiGPT
🔥 Released OpenThaiGPT 7b 1.0.0-beta (16/08/23)

🇹🇭 OpenThaiGPT 1.0.0-beta (16 August 2023)
🇹🇭 OpenThaiGPT Version 1.0.0-beta is a Thai-language 7B-parameter LLaMA v2 Chat model fine-tuned to follow Thai-translated instructions, with more than 24,554 of the most frequently used Thai words added to the LLM's vocabulary for faster tokenization and generation.

Web Demo:

Colab Demo:

Change Logs

🇹🇭 Version 1.0.0-beta (Llama v2 + 24,554 Thai word extension)

Release date: 16 August 2023
🇹🇭 OpenThaiGPT Version 1.0.0-beta is a Thai-language 7B-parameter LLaMA v2 Chat model fine-tuned to follow Thai-translated instructions, with 24,554 Thai words added to the vocabulary for faster tokenization and generation.
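As a rough illustration of how this kind of vocabulary extension is typically done, the sketch below adds a few Thai words to a LLaMA v2 tokenizer and resizes the model's embedding matrix with the Hugging Face transformers API. The base-model ID and the short word list are placeholders for illustration only, not the project's actual word list or training code.

```python
from transformers import LlamaForCausalLM, LlamaTokenizer

# Placeholder base model (Meta's checkpoint on the Hub is gated and requires
# approval); the real extension would use ~24,554 frequent Thai words.
base_model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = LlamaTokenizer.from_pretrained(base_model_id)
model = LlamaForCausalLM.from_pretrained(base_model_id)

thai_words = ["ประเทศไทย", "กรุงเทพมหานคร", "ภาษาไทย"]  # illustrative sample only
num_added = tokenizer.add_tokens(thai_words)

# Grow the embedding matrix to cover the new tokens; the new rows still
# have to be learned during instruction fine-tuning.
model.resize_token_embeddings(len(tokenizer))
print(f"Added {num_added} tokens; vocabulary size is now {len(tokenizer)}")
```

Because frequent Thai words become single tokens instead of many sub-word pieces, Thai prompts encode into fewer tokens, which is the usual source of the speed-up.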

License

Source code: Apache License 2.0. Weights: research and commercial use.

Code and Weight

Authors

---

🇹🇭 Version 1.0.0-alpha (Facebook LLaMA v2 Model)

Release date: 3 August 2023
🇹🇭 OpenThaiGPT Version 1.0.0-alpha is the first Thai-adapted 7B-parameter LLaMA v2 Chat model, fine-tuned to follow Thai-translated instructions and built on the Hugging Face LLaMA implementation.

Changes

(1) Uses Facebook's LLaMA v2 7B Chat model, pretrained on over 2 trillion tokens, as the base model.
(2) Context length is upgraded from 2,048 tokens to 4,096 tokens.
(3) Allows research and commercial use.
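As a usage reference, here is a minimal sketch of loading the LLaMA v2 7B Chat base model named in change (1) with Hugging Face transformers and checking the 4,096-token context window from change (2). The Thai prompt is an illustrative placeholder, not OpenThaiGPT's documented prompt format.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base model from change (1); access on the Hugging Face Hub is gated by Meta.
model_id = "meta-llama/Llama-2-7b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Context window from change (2): 4096 tokens.
print(model.config.max_position_embeddings)

prompt = "สวัสดีครับ ช่วยแนะนำสถานที่ท่องเที่ยวในเชียงใหม่หน่อย"  # "Hello, please recommend places to visit in Chiang Mai"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```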

License

Source code: Apache License 2.0. Weights: research and commercial use.

Code and Weight

Authors

---

Version 0.1.0-beta (Facebook LLaMA Model)

Release date: 16 May 2023
OpenThaiGPT Version 0.1.0-beta is a 7B-parameter LLaMA model fine-tuned to follow Thai-translated instructions, built on the Hugging Face LLaMA implementation.

Statistics

  • Number of parameters: 7B
  • Dimension: 4096
  • Context length: 2,048 tokens
  • Number of heads: 32
  • Number of layers: 32
  • Training tokens: 1T
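The same numbers expressed as a Hugging Face LlamaConfig, purely as an architecture sketch and not the project's actual configuration file:

```python
from transformers import LlamaConfig

# Architecture hyperparameters from the statistics above.
config = LlamaConfig(
    hidden_size=4096,              # Dimension: 4096
    num_hidden_layers=32,          # Number of layers: 32
    num_attention_heads=32,        # Number of heads: 32
    max_position_embeddings=2048,  # Context length: 2,048 tokens
)
print(config)
```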

License

Source code: Apache License 2.0. Weights: research use only (due to the license on Facebook's LLaMA weights). Note: a commercial-use license for the OpenThaiGPT 0.1.0 weights will be released soon.

Code and Weight

Authors

Kobkrit Viriyayudhakorn ([email protected]), Sumeth Yuenyong ([email protected]) and Thaweewat Rugsujarit ([email protected]).

Trained Datasets

  • 43,000 pairs: Alpaca Finance instructions translated into Thai by Thaweewat Ruksujarit
  • 600 pairs: RD's Tax QA Chatbot training set by ทรงวุฒิ บุรงค์
  • 4,000 pairs: iApp Technology's extractive QA dataset in Thai
  • 15,000 pairs: Databricks' Dolly instructions translated into Thai by Thaweewat Ruksujarit
  • 52,000 pairs: Instruction Wild dataset translated into Thai by Thaweewat Ruksujarit
  • 51,000 pairs: Stanford Alpaca dataset translated into Thai by Thaweewat Ruksujarit
  • 20,000 pairs: GPT Teacher instructions translated into Thai by Thaweewat Ruksujarit
  • 600 pairs: O-NET M6 Social Studies exam
  • 24,000 pairs: Hello-SimpleAI summary dataset translated into Thai by Thaweewat Ruksujarit
  • 5,000 pairs: Thai SelfInstruct dataset (automatically generated) by OpenThaiGPT
---

Version 0.1.0-alpha (ByT5-XL Model)

OpenThaiGPT version 0.1.0-alpha
The first Thai 3-billion-parameter model
  • First Thai Byte-Level Text-to-Text Transfer Transformer (ByT5)
  • Supports instruction following:
    • Translation to Thai
    • Explanation
    • Paraphrasing
  • Zero-shot and few-shot learning
  • Pretraining Model: ByT5-XL (3.74 billion parameters)
  • InstructDataset: 50,000 Thai SelfInstruct
  • RLHF: None
  • Developer: Sumeth Yuenyong, Kobkrit Viriyayudhakorn ([email protected])
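A minimal sketch of running the ByT5-XL base model named above for byte-level text-to-text generation with Hugging Face transformers; the instruction prompt is an illustrative placeholder, not the project's documented format.

```python
from transformers import AutoTokenizer, T5ForConditionalGeneration

# ByT5-XL operates directly on UTF-8 bytes, so Thai text needs no
# language-specific tokenizer or vocabulary.
model_id = "google/byt5-xl"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

prompt = "Translate to Thai: Hello, how are you?"  # illustrative instruction only
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```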

PoC Version 0.0.4 (The Fourth PoC Version)

OpenThaiGPT version 0.0.4
The Fourth PoC Model
  • Answers questions in more detail and, in most cases, answers better than version 0.0.3
  • Pretraining Model: GPT-2 Thai-base
  • InstructDataset: 300,000 Pantip + 5,000 Wiki QA => 12,920 Thai InstructGPT
  • RLHF: None
  • Developer: Kobkrit Viriyayudhakorn ([email protected])

PoC Version 0.0.3 (The Third PoC Version)

OpenThaiGPT version 0.0.3
The Third PoC Model
  • Pretraining Model: GPT-2 Thai-base
  • InstructDataset: 300,000 Pantip + 5,000 Wiki QA => 7,000 Thai InstructGPT
  • RLHF: None
  • Developer: Kobkrit Viriyayudhakorn ([email protected])

PoC Version 0.0.2 (The Second PoC Version)

Release date: 27 February 2023
Model and Weight: https://huggingface.co/kobkrit/openthaigpt-gpt2-instructgpt-poc-0.0.2
PIP Installation Page: {Coming Soon}
Colab Example: {Coming Soon}
OpenThaiGPT version 0.0.2
The Second PoC Model
  • Pretraining Model: GPT-2 Thai-base
  • InstructDataset: 7,000 Thai InstructGPT
  • RLHF: None
  • Developer: Kobkrit Viriyayudhakorn ([email protected])
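A minimal sketch of loading the 0.0.2 checkpoint linked above from the Hugging Face Hub; the Thai prompt and sampling settings are illustrative placeholders.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# PoC 0.0.2 checkpoint from the "Model and Weight" link above.
model_id = "kobkrit/openthaigpt-gpt2-instructgpt-poc-0.0.2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "ประเทศไทยมีกี่จังหวัด"  # "How many provinces does Thailand have?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```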

PoC Version 0.0.1 (Very First PoC Version)

Release date: 20 February 2023
Model and Weight: openthaigpt-gpt2-pantipwiki-poc
PIP Installation Page: {Coming Soon}
Colab Example: {Coming Soon}
The Very First PoC Model
  • Pretraining Model: GPT-2 Thai-base
  • InstructDataset: 298,678 QA pairs derived from 70,000 Pantip threads (kratoo) plus Wikipedia QA by iApp
  • RLHF: None
  • Developer: Kobkrit Viriyayudhakorn ([email protected])