This model is a fine-tuned version of ParsBERT on the PersianQA dataset. It achieves the following result on the evaluation set: Loss: 1.7297.
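The fine-tuned checkpoint can be queried with the 🤗 Transformers question-answering pipeline. A minimal sketch, assuming a placeholder Hub repo id (`your-username/parsbert-persianqa`); substitute the actual repository name of this model:

```python
from transformers import pipeline

# Placeholder repo id: replace with the actual Hub id of the fine-tuned model.
model_id = "your-username/parsbert-persianqa"

qa = pipeline("question-answering", model=model_id, tokenizer=model_id)

result = qa(
    question="پایتخت ایران کجاست؟",      # "What is the capital of Iran?"
    context="تهران پایتخت ایران است.",   # "Tehran is the capital of Iran."
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': 'تهران'}
```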
ParsBERT is a monolingual language model based on Google's BERT architecture. It is pre-trained on large Persian corpora covering a variety of writing styles and subjects.
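Because ParsBERT follows the standard BERT-base architecture, it can be loaded as a plain encoder for Persian text. A minimal sketch, assuming the `HooshvareLab/bert-base-parsbert-uncased` checkpoint (the exact base checkpoint used for this fine-tune is an assumption):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed ParsBERT checkpoint on the Hub; swap in whichever release the fine-tune started from.
checkpoint = "HooshvareLab/bert-base-parsbert-uncased"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

sentence = "پارس‌برت یک مدل زبانی برای زبان فارسی است."  # "ParsBERT is a language model for Persian."
inputs = tokenizer(sentence, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token; hidden size 768 for the BERT-base architecture.
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```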
The original BERT is a model pretrained on English text with a masked language modeling (MLM) objective; it was introduced in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., 2018) and first released in the google-research/bert repository. ParsBERT applies the same pretraining recipe to Persian.
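The MLM objective can be seen directly with the fill-mask pipeline. The snippet below uses the standard English `bert-base-uncased` checkpoint purely to illustrate the objective, not this model:

```python
from transformers import pipeline

# bert-base-uncased is the original English checkpoint trained with the MLM objective.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

# The model ranks candidate tokens for the [MASK] position.
for prediction in fill_mask("The capital of France is [MASK]."):
    print(f"{prediction['token_str']:>10}  {prediction['score']:.3f}")
```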
Time to look at question answering! This task comes in many flavors, but the one we'll focus on here is extractive question answering, where the answer is a span of text extracted from a given context; see the sketch below.
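In extractive QA the model predicts a start and an end position over the context tokens, and the answer is the span between them. A sketch of those mechanics, using the public `distilbert-base-cased-distilled-squad` checkpoint only as an illustration; the ParsBERT fine-tune works the same way on Persian inputs:

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Any extractive-QA checkpoint works here; this English one is used only for illustration.
checkpoint = "distilbert-base-cased-distilled-squad"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForQuestionAnswering.from_pretrained(checkpoint)

question = "What does extractive QA return?"
context = "Extractive question answering returns a span of the context as the answer."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# The model scores every token as a possible start and end of the answer;
# the highest-scoring pair delimits the extracted span.
start = outputs.start_logits.argmax()
end = outputs.end_logits.argmax()
answer_ids = inputs["input_ids"][0, start : end + 1]
print(tokenizer.decode(answer_ids))
```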