
Why AI language models choke on too much text

Large language models represent text using tokens, each of which typically corresponds to a few characters. Short words like "the" or "it" are represented by a single token, whereas longer words may be split into several tokens (GPT-4o represents "indivisible" with "ind," "iv," and "isible").
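As a rough illustration of how this splitting works, the sketch below counts tokens with the open-source tiktoken library; it assumes a recent version of the library that includes the GPT-4o (o200k_base) encoding, and the exact splits it prints depend on the tokenizer version rather than on anything stated in the article.

```python
# Minimal sketch: how individual words map to tokens, assuming a recent
# tiktoken release that ships the GPT-4o (o200k_base) encoding.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-4o")

for word in ["the", "it", "indivisible"]:
    token_ids = encoding.encode(word)                          # word -> list of token IDs
    pieces = [encoding.decode([token_id]) for token_id in token_ids]  # each ID back to text
    print(f"{word!r} -> {len(token_ids)} token(s): {pieces}")
```

On a typical run, the short words come back as single token IDs, while "indivisible" is split into multiple pieces, in line with the example above.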

When OpenAI released ChatGPT two years ago, it had a memory, known as a context window, of just 8,192 tokens. That works out to roughly 6,000 words of text. This meant that if you fed it more than about 15 pages of text, it would "forget" information from the beginning of its context. This limited the size and complexity of tasks ChatGPT could handle.
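For a rough sense of where those numbers come from, here is a back-of-the-envelope sketch; the ratios of about 0.75 English words per token and roughly 400 words per page are common rules of thumb, not figures from the article.

```python
# Back-of-the-envelope conversion from a context window size to words and
# pages. Both ratios below are rough rules of thumb for English prose.
context_tokens = 8_192
words_per_token = 0.75   # roughly 3/4 of a word per token, on average
words_per_page = 400     # a loose figure for a page of text

approx_words = context_tokens * words_per_token   # about 6,100 words
approx_pages = approx_words / words_per_page      # about 15 pages

print(f"~{approx_words:,.0f} words, ~{approx_pages:.0f} pages")
```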

Today’s LLMs are far more capable.


Microsoft built a PC that can’t run local apps

19 November 2024 at 05:30

Prefer to offload all your Windows tasks to the cloud? Microsoft may just have the compact, desk-bound computer for you. On Tuesday at Microsoft Ignite 2024, the tech giant unveiled Windows 365 Link, a fanless, lightweight PC that connects to Windows 365. Windows 365 is a cloud-hosted, virtual Windows machine, like a typical Windows […]

