DistilBERT base uncased finetuned SST-2
Table of Contents
Model Details
How to Get Started With the Model
Uses
Risks, Limitations and Biases
Training
Model Details
…
FinBERT is a pre-trained NLP model for analyzing the sentiment of financial text. It was built by further training the BERT language model on the finance domain, using a large financial…
The crispy rerank family from mixedbread ai.
mxbai-rerank-xsmall-v1
This is the smallest model in our family of powerful reranker models. You can learn more about the models in…
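Rerankers like mxbai-rerank-xsmall-v1 are cross-encoders: given a (query, document) pair they return a relevance score, and candidate documents are sorted by that score. A minimal sketch of the pattern, with a hypothetical word-overlap `toy_score` standing in for the actual model:

```python
from typing import Callable

def rerank(query: str, documents: list[str],
           score: Callable[[str, str], float],
           top_k: int = 3) -> list[tuple[str, float]]:
    """Score every (query, document) pair and return the top_k
    documents ordered by descending relevance."""
    scored = [(doc, score(query, doc)) for doc in documents]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Hypothetical stand-in scorer: crude word overlap instead of the model.
def toy_score(query: str, doc: str) -> float:
    q = set(query.lower().split())
    d = set(doc.lower().split())
    return len(q & d) / max(len(q), 1)

docs = ["mixedbread bakes rerankers", "unrelated text", "rerankers sort documents"]
print(rerank("rerankers", docs, toy_score, top_k=2))
```

In practice `score` would be one call to the reranker model per pair, which is why these models are applied to a small candidate set retrieved by a cheaper first-stage search.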
Reward Model Overview
The reward model is trained from the base model google/gemma-2b-it. See also the 7B version, RM-Gemma-7B.
Model Details
If you have any questions…
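A reward model maps a (prompt, response) pair to a scalar score; a common use is best-of-n sampling, where the highest-scoring candidate response is kept. A sketch of that selection loop, with a hypothetical `toy_reward` stub in place of the actual reward model:

```python
from typing import Callable

def best_of_n(prompt: str, candidates: list[str],
              reward: Callable[[str, str], float]) -> str:
    """Return the candidate response with the highest reward score."""
    return max(candidates, key=lambda resp: reward(prompt, resp))

# Hypothetical stand-in reward: favor prompt overlap, then length.
def toy_reward(prompt: str, response: str) -> float:
    overlap = len(set(prompt.split()) & set(response.split()))
    return overlap + 0.01 * len(response)

candidates = ["Paris.", "The capital of France is Paris.", "I don't know."]
print(best_of_n("What is the capital of France?", candidates, toy_reward))
```

The same scoring interface is what RLHF pipelines optimize against; best-of-n just replaces gradient updates with selection.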
Twitter-roBERTa-base for Sentiment Analysis - UPDATED (2022)
This is a RoBERTa-base model trained on ~124M tweets from January 2018 to December 2021, and finetuned for sentiment analysis with the…
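Classifiers like this one emit one logit per class (negative/neutral/positive here), which a softmax turns into probabilities before taking the argmax. A self-contained sketch of that post-processing step; the logits below are made up for illustration:

```python
import math

def softmax(logits: list[float]) -> list[float]:
    """Numerically stable softmax: shift by the max logit first."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

LABELS = ["negative", "neutral", "positive"]

def predict(logits: list[float]) -> tuple[str, float]:
    """Map raw logits to (label, probability) via softmax + argmax."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# Made-up logits for a clearly positive tweet.
print(predict([-1.2, 0.3, 2.9]))
```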
BCEmbedding: Bilingual and Crosslingual Embedding for RAG
For the latest and most detailed information on bce-reranker-base_v1, please see (The latest "Updates" should be checked…
In loving memory of Simon Mark Hughes...
Introduction
The HHEM model is an open-source model created by Vectara for detecting hallucinations in LLMs. It is particularly useful…
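Hallucination detectors of this kind score a (source, output) pair for factual consistency, and low scores flag likely hallucinations. A sketch of the thresholding pattern, with a hypothetical `toy_consistency` stub in place of the actual HHEM model:

```python
from typing import Callable

def flag_hallucinations(pairs: list[tuple[str, str]],
                        consistency: Callable[[str, str], float],
                        threshold: float = 0.5) -> list[bool]:
    """Return True for pairs whose consistency score falls below threshold."""
    return [consistency(src, out) < threshold for src, out in pairs]

# Hypothetical stand-in scorer: fraction of output words found in the source.
def toy_consistency(source: str, output: str) -> float:
    src = set(source.lower().split())
    out = output.lower().split()
    return sum(w in src for w in out) / max(len(out), 1)

pairs = [
    ("the cat sat on the mat", "the cat sat"),
    ("the cat sat on the mat", "dogs fly over mountains"),
]
print(flag_hallucinations(pairs, toy_consistency))  # [False, True]
```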
DistilRoberta-financial-sentiment
This model is a fine-tuned version of distilroberta-base on the financial_phrasebank dataset.
It achieves the following results on the evaluation set:
Loss: 0.1116
Accuracy: 0.9823
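The reported accuracy is simply the fraction of evaluation examples where the predicted label matches the gold label. A sketch of that computation on hypothetical predictions:

```python
def accuracy(predictions: list[str], labels: list[str]) -> float:
    """Fraction of positions where prediction equals the gold label."""
    assert len(predictions) == len(labels)
    correct = sum(p == y for p, y in zip(predictions, labels))
    return correct / len(labels)

# Hypothetical predictions vs. gold labels on a tiny eval set.
preds = ["positive", "neutral", "negative", "positive"]
gold  = ["positive", "neutral", "positive", "positive"]
print(accuracy(preds, gold))  # → 0.75
```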
Base Model description
…
