Document (#44312)

Author
Donath, A.
Title
LlaMA 3 ist da : Meta AI
Issue
[19.04.2024].
Year
2024
Abstract
Meta has unveiled the next generation of its large language model (LLM), Llama 3, which is said to be more capable than most current AI models. Meta presented Llama 3 as a preview release and plans to make it available soon to cloud providers such as AWS and to model libraries such as Hugging Face. According to a company blog post, the accompanying chatbot can already be tried out at meta.ai in some countries. Llama 3 has also been integrated into search in WhatsApp and Instagram. Llama 3 currently comes in two model sizes, with 8B and 70B parameters; the B stands for billion and indicates how complex a model is. The larger variant was trained on data up to December 2023, while the smaller version only knows data up to March 2023. Meta is currently training a 400B model.
Theme
Computational linguistics
Field
Computer science
Object
Llama

Similar documents (content)

  1. Hahn, S.: DarkBERT ist mit Daten aus dem Darknet trainiert : ChatGPTs dunkler Bruder? (2023) 0.12
  2. Luca, E.W. de; Dahlberg, I.: ¬Die Multilingual Lexical Linked Data Cloud : eine mögliche Zugangsoptimierung? (2014) 0.08
  3. Rötzer, F.: Kann KI mit KI generierte Texte erkennen? (2019) 0.08
  4. Kartoo visualisiert Suchbegriffe (2003) 0.07
  5. Wolfram Language erkennt Bilder (2015) 0.07