Commit
Merge pull request #154 from stanford-crfm/jonathan/0216-weekly-assets
add form response assets + add weekly assets
Showing 39 changed files with 1,080 additions and 12 deletions.
@@ -22,3 +22,25 @@
  prohibited_uses: ''
  monitoring: none
  feedback: Feedback can be sent to authors via [email protected]
- type: model
  name: MiniMA
  organization: Beijing Institute of Technology
  description: MiniMA is a smaller, fine-tuned Llama 2 model adapted for Chinese.
  created_date: 2023-11-13
  url: https://github.com/GeneZC/MiniMA
  model_card: https://huggingface.co/GeneZC/MiniMA-3B
  modality: text; text
  analysis: Evaluated on standard benchmarks including MMLU, CEval, and DROP.
  size: 3B parameters (dense)
  dependencies: [Llama 2]
  training_emissions: unknown
  training_time: unknown
  training_hardware: 8 A100 80G GPUs
  quality_control: ''
  access: open
  license: Llama 2
  intended_uses: ''
  prohibited_uses: ''
  monitoring: unknown
  feedback: https://huggingface.co/GeneZC/MiniMA-3B/discussions
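These asset entries share a flat key-value schema. As a rough illustration, here is a minimal sketch of how such a file could be loaded and sanity-checked with PyYAML; the file path and the required-field list are assumptions for the example, not part of this PR.

import yaml

# Assumed path; the actual repository layout may differ.
ASSET_FILE = "assets/bit.yaml"

# Fields that every model entry in this schema appears to carry.
REQUIRED_FIELDS = [
    "type", "name", "organization", "description", "created_date",
    "url", "model_card", "modality", "analysis", "size", "dependencies",
    "training_emissions", "training_time", "training_hardware",
    "quality_control", "access", "license", "intended_uses",
    "prohibited_uses", "monitoring", "feedback",
]

with open(ASSET_FILE) as f:
    assets = yaml.safe_load(f)  # the top level is a list of entries

for entry in assets:
    missing = [field for field in REQUIRED_FIELDS if field not in entry]
    if missing:
        print(f"{entry.get('name', '<unnamed>')}: missing {missing}")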
@@ -0,0 +1,24 @@
---
- type: model
  name: CausalLM
  organization: CausalLM
  description: CausalLM is an LLM based on the model weights of Qwen and trained with a model architecture identical to LLaMA 2.
  created_date: 2023-10-21
  url: https://huggingface.co/CausalLM/14B
  model_card: https://huggingface.co/CausalLM/14B
  modality: text; text
  analysis: Evaluated on standard benchmarks across a range of tasks.
  size: 14B parameters (dense)
  dependencies: [Qwen, OpenOrca, Open Platypus]
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: open
  license:
    explanation: can be found at https://github.com/rpherrera/WTFPL (HuggingFace lists this as the license)
    value: WTFPL
  intended_uses: ''
  prohibited_uses: ''
  monitoring: unknown
  feedback: none
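Note that some fields, such as license here or created_date in other entries, are written either as a bare scalar or as a nested mapping with explanation and value keys. A small helper like the following (an illustrative sketch, not a function from the repository) could normalize both forms when consuming these files:

def normalize_field(field):
    """Return the underlying scalar, whether the field is a bare value
    (e.g. license: Llama 2) or a mapping with explanation/value keys
    (as in the CausalLM license above)."""
    if isinstance(field, dict):
        return field.get("value")
    return field

# e.g. normalize_field(entry["license"]) -> "WTFPL" for CausalLM,
# and "Llama 2" for MiniMA.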
@@ -0,0 +1,48 @@
---
- type: model
  name: Dolphin 2.2 Yi
  organization: Cognitive Computations
  description: Dolphin 2.2 Yi is an LLM based on Yi.
  created_date: 2023-11-14
  url: https://erichartford.com/dolphin
  model_card: https://huggingface.co/cognitivecomputations/dolphin-2_2-yi-34b
  modality: text; text
  analysis: none
  size: 34B parameters (dense)
  dependencies: [Dolphin, Yi]
  training_emissions: unknown
  training_time: 3 days
  training_hardware: 4 A100 GPUs
  quality_control: ''
  access: open
  license:
    explanation: can be found at https://huggingface.co/cognitivecomputations/dolphin-2_2-yi-34b/blob/main/LICENSE
    value: custom
  intended_uses: ''
  prohibited_uses: ''
  monitoring: unknown
  feedback: https://huggingface.co/cognitivecomputations/dolphin-2_2-yi-34b/discussions
- type: model
  name: WizardLM Uncensored
  organization: Cognitive Computations
  description: WizardLM Uncensored is WizardLM trained on a subset of the dataset from which responses containing alignment or moralizing were removed.
  created_date:
    explanation: release date is not published; estimated to be sometime in either May or June 2023.
    value: 2023-06-01
  url: https://huggingface.co/cognitivecomputations/WizardLM-30B-Uncensored
  model_card: https://huggingface.co/cognitivecomputations/WizardLM-30B-Uncensored
  modality: text; text
  analysis: Evaluated on the OpenLLM leaderboard.
  size: 30B parameters (dense)
  dependencies: [WizardLM]
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: open
  license: unknown
  intended_uses: ''
  prohibited_uses: ''
  monitoring: unknown
  feedback: https://huggingface.co/cognitivecomputations/WizardLM-30B-Uncensored/discussions
@@ -0,0 +1,22 @@
---
- type: model
  name: SaiLy
  organization: Deepnight Research
  description: SaiLy is a collection of highly experimental, uncensored AI models by Deepnight Research.
  created_date: 2023-11-04
  url: https://huggingface.co/deepnight-research/saily_100b
  model_card: https://huggingface.co/deepnight-research/saily_100b
  modality: text; text
  analysis: none
  size: 100B parameters (dense)
  dependencies: []
  training_emissions: unknown
  training_time: unknown
  training_hardware: unknown
  quality_control: ''
  access: open
  license: MIT
  intended_uses: ''
  prohibited_uses: ''
  monitoring: unknown
  feedback: https://huggingface.co/deepnight-research/saily_100b/discussions