the porous city
I was particularly struck by the assertion that “There is no restriction on leaving the wolf and the cabbage together, as the wolf does not pose a threat to the cabbage.” It says this immediately after noting that “you can't leave the wolf alone with the cabbage”. All of this is consistent with the idea that GPT-4 relies heavily on learned patterns. This puzzle must appear many times in its training data, and GPT-4 presumably has strongly “memorized” the solution, so strongly that when it sees a related puzzle, it's unable to articulate a different solution; the gravitational pull of the memorized solution is too strong .... For a final data point, I started a fresh chat session and restated the puzzle using made-up words for the three items – “I need to carry a bleem, a fleem, and a gleem across a river”. This time, freed from the gravitational pull of the word “goat”, it was able to map the pattern of the known answer onto the words in my question, and answered perfectly.
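That substitution test is easy to rig up yourself. Here's a minimal sketch – the made-up words come from the quoted experiment, but the template and constraint phrasing are my own guesses, and sending the prompts to a model is left to whatever API client you use:

    TEMPLATE = (
        "I need to carry a {a}, a {b}, and a {c} across a river. "
        "The boat only holds me plus one item. "
        "I can't leave the {a} alone with the {b}, "
        "and I can't leave the {b} alone with the {c}. "
        "How do I get everything across?"
    )

    # The classic, heavily memorized wording...
    classic = TEMPLATE.format(a="wolf", b="goat", c="cabbage")

    # ...and the same constraints with made-up words, which removes the
    # gravitational pull of the memorized "goat" solution.
    scrambled = TEMPLATE.format(a="bleem", b="fleem", c="gleem")

    for prompt in (classic, scrambled):
        print(prompt, end="\n\n")

If the model solves the scrambled version cleanly but stumbles on a reworded classic one, that's evidence for pattern-matching over reasoning.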
On GPT thinking out loud:

GPT-4 is very explicitly using the chat transcript to manage its progress through the subproblems. At each step, it restates information, thus copying that information to the end of the transcript, where it is “handy” ... Here's one way of looking at it: in the “transformer” architecture used by current LLMs, the model can only do a fixed amount of computation per word. When more computation is needed, the model can give itself space by padding the output with extra words. But I think it's also a reasonable intuition to just imagine that the LLM is thinking out loud.
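The “fixed amount of computation per word” point can be made concrete with a back-of-the-envelope estimate. A rough and widely used rule of thumb is that a transformer forward pass costs about 2 × (parameter count) FLOPs per generated token; the model size and token counts below are made up for illustration:

    N_PARAMS = 175e9  # assumed GPT-3-scale model, purely illustrative

    def generation_flops(num_tokens: int, n_params: float = N_PARAMS) -> float:
        """Approximate FLOPs spent generating num_tokens tokens."""
        return 2 * n_params * num_tokens

    terse = generation_flops(5)             # e.g. "Take the goat first."
    thinking_aloud = generation_flops(300)  # restating each subproblem step by step

    print(f"terse answer:   {terse:.2e} FLOPs")
    print(f"thinking aloud: {thinking_aloud:.2e} FLOPs")
    print(f"ratio: {thinking_aloud / terse:.0f}x more computation")

The per-token cost is constant no matter how hard the question is, so emitting more tokens is the model's only way to spend more computation on a problem.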
On the context window as a fundamental handicap:

They are locked into a rigid model of repeatedly appending single words to an immutable transcript, making it impossible for them to backtrack or revise. It is possible to plan and update strategies and check work in a transcript, and it is possible to simulate revisions through workarounds like “on second thought, let's redo subproblem X with the following change”, but a transcript is not a good data structure for any of this and so the model will always be working at a disadvantage.
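To see why an append-only transcript is a poor data structure for revision, compare it with a mutable scratchpad. A toy sketch – the Transcript and Workspace classes here are my own invention, not anything from the post:

    class Transcript:
        """Append-only, like an LLM's context window."""
        def __init__(self):
            self._entries: list[str] = []

        def append(self, text: str) -> None:
            self._entries.append(text)

        # No delete, no in-place edit: a "revision" is just more appended
        # text, and the stale version stays in the window, taking up space.
        def revise(self, correction: str) -> None:
            self.append("On second thought: " + correction)

    class Workspace:
        """A mutable scratchpad – what the model doesn't have."""
        def __init__(self):
            self.cells: dict[str, str] = {}

        def revise(self, key: str, value: str) -> None:
            self.cells[key] = value  # the stale version is actually gone

    t = Transcript()
    t.append("Subproblem X: move the goat first.")
    t.revise("move the cabbage first.")  # both versions now occupy context

    w = Workspace()
    w.cells["X"] = "move the goat first."
    w.revise("X", "move the cabbage first.")  # only the current version remains

Every token a stale draft occupies is a token the model must re-read on each pass, which is the disadvantage the author is pointing at.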
last modified: 16:07:16 14-Apr-2023
in categories: Tech/AI
This is Lukas Bergstrom's weblog. You can also find me on Twitter and LinkedIn.