Yahoo Romania Web Search

Search results

  1. 🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX. Using pretrained models can reduce your compute costs and carbon footprint, and save the time and resources required to train a model from scratch. (A short quickstart sketch follows the sub-results below.)

    • Time Series Transformer

      A transformers.modeling_outputs.Seq2SeqTSModelOutput or a...

    • Bert

      Overview. The BERT model was proposed in BERT: Pre-training...

    • Tokenizer

      Parameters. new_tokens (str, tokenizers.AddedToken or a...

    • Train With a Script

      The example script downloads and preprocesses a dataset from...

    • Trainer

      Trainer is a simple but feature-complete training and eval...

    • Training on One GPU

      For an example of using torch.compile with 🤗 Transformers,...

    • Pipelines

      Pipelines. The pipelines are a great and easy way to use...

    • Installation

      Install 🤗 Transformers for whichever deep learning library...
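
    The sub-results above (Installation, Tokenizer, Pipelines) are truncated snippets, so here is a minimal quickstart sketch of the workflow they describe. It assumes transformers and PyTorch are installed (pip install transformers torch) and that the default checkpoints each call downloads from the Hub are acceptable.

      # Pipelines wrap tokenization, the model forward pass, and post-processing
      # behind a single call; the tokenizer can also be used on its own.
      from transformers import AutoTokenizer, pipeline

      classifier = pipeline("sentiment-analysis")   # downloads a default checkpoint
      print(classifier("Using pretrained models saves training time."))
      # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

      # Inspect how a checkpoint's tokenizer encodes text.
      tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
      print(tokenizer("Hello, Transformers!")["input_ids"])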

  2. GPT-2 is a large transformer-based language model with 1.5 billion parameters, trained on a dataset[1] of 8 million web pages. GPT-2 is trained with a simple objective: predict the next word, given all of the previous words within some text.
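
    As a concrete illustration of that next-word objective, the sketch below scores the next token given a prefix, using the publicly released "gpt2" checkpoint; it assumes transformers and PyTorch are installed.

      import torch
      from transformers import GPT2LMHeadModel, GPT2TokenizerFast

      tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
      model = GPT2LMHeadModel.from_pretrained("gpt2")

      # The language-modelling head returns one score per vocabulary entry for
      # every position; the last position is the "next word" prediction.
      inputs = tokenizer("The quick brown fox jumps over the", return_tensors="pt")
      with torch.no_grad():
          logits = model(**inputs).logits            # [batch, seq_len, vocab_size]
      next_token_id = logits[0, -1].argmax().item()  # most likely next token
      print(tokenizer.decode([next_token_id]))       # e.g. " lazy"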

  3. Train state-of-the-art models in 3 lines of code. Deep interoperability between TensorFlow 2.0 and PyTorch models. Move a single model between TF2.0/PyTorch frameworks at will. Seamlessly pick the right framework for training, evaluation, and production.
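
    A sketch of that interoperability, assuming both PyTorch and TensorFlow 2.0 are installed; "bert-base-uncased" is only an example checkpoint, and the local path is hypothetical.

      from transformers import AutoModel, TFAutoModel

      # Load the checkpoint as a PyTorch model and save it locally.
      pt_model = AutoModel.from_pretrained("bert-base-uncased")
      pt_model.save_pretrained("./bert-local")

      # Reload the same weights into a TensorFlow 2.0 model; from_pt=True
      # converts the PyTorch state dict on the fly.
      tf_model = TFAutoModel.from_pretrained("./bert-local", from_pt=True)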

  4. State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow. 🤗 Transformers provides thousands of pretrained models to perform tasks on different modalities such as text, vision, and audio. These models can be applied on: 📝 Text, for tasks like text classification, information extraction, question answering, summarization ...
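
    For the text tasks listed above, the task name passed to pipeline() selects a suitable pretrained model; a hedged sketch, assuming the default Hub checkpoints for each task:

      from transformers import pipeline

      # Extractive question answering over a short context.
      qa = pipeline("question-answering")
      print(qa(question="What does the library provide?",
               context="🤗 Transformers provides thousands of pretrained models "
                       "for text, vision, and audio tasks."))

      # Abstractive summarization of a longer passage.
      summarizer = pipeline("summarization")
      print(summarizer("🤗 Transformers provides thousands of pretrained models to "
                       "perform tasks on text, vision, and audio, including "
                       "classification, information extraction, question answering, "
                       "and summarization.", max_length=25, min_length=5))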

  5. TL;DR: Phi-3 introduces new RoPE scaling methods, which seem to scale fairly well. A ~3B-parameter model, Phi-3-mini, is available in two context-length variants, 4K and 128K tokens. It is the first model in its class to support a context window of up to 128K tokens, with little impact on quality. Phi-3 by @gugarosa in #30423; JetMoE
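
    A hedged sketch of loading one of the Phi-3 checkpoints described above; the Hub id below is an assumption (the release note does not name it), and the 128K-context variant needs a recent transformers release and substantial GPU memory.

      import torch
      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "microsoft/Phi-3-mini-128k-instruct"   # assumed id of the 128K variant
      tokenizer = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

      prompt = "Summarize RoPE scaling in one sentence."
      inputs = tokenizer(prompt, return_tensors="pt")
      output = model.generate(**inputs, max_new_tokens=64)
      print(tokenizer.decode(output[0], skip_special_tokens=True))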

  6. Hugging Face Transformers (www.hugging-face.org › hugging-face-transformers)

    20 Nov 2023 · Hugging Face Transformers offers cutting-edge machine learning tools for PyTorch, TensorFlow, and JAX. This platform provides easy-to-use APIs and tools for downloading and training top-tier pretrained models.

  7. 🤗 transformers is a library maintained by Hugging Face and the community for state-of-the-art machine learning with PyTorch, TensorFlow, and JAX. It provides thousands of pretrained models to perform tasks on different modalities such as text, vision, and audio. We are a bit biased, but we really like 🤗 transformers!
