10/26/2025
5 min read

How to Get Noticed in AI Searches?

How to Ensure AI Systems Recognize Your Website and Recommend It in Search Results.

Written by
Mikołaj Drużkowski
Founder

With the new AI browsers from OpenAI and Perplexity, and Google's AI Overviews, you may be wondering how to make your website easier for AI to find. How do you get AI systems to see your site and recommend it in search results? Somewhere along the way, you've probably also stumbled upon llms.txt.

It looks like a must-have. But is it really?

What is llms.txt?

It's a simple text file that points AI systems to the most important content on your site. It's placed in the root directory at yoursite.com/llms.txt.

The structure is very simple:

```
# Your Company Name
> Brief description
## Most Important Pages
- [Product](link): description
- [Pricing](link): description
```

Everyone says it's the new standard. But there's one problem. Nobody knows if it actually works.

The Problem with Confirming Effectiveness

None of the major AI companies (OpenAI, Google, Anthropic) has officially confirmed that they use these files. It's a community idea that might catch on. Or might not. And here's the question: is it worth investing time in something that might just be a trend?

I see companies adding this file and waiting for improved visibility. Nothing happens. Why? Because they have completely different, fundamental problems.

What Actually Affects Visibility in AI Search?

After working with dozens of companies, I see recurring mistakes that actually impact how AI understands and presents websites.

Content inaccessible to bots. The site loads in 8 seconds, while AI bots wait a maximum of 3-4 seconds. Or the main content is rendered by JavaScript and only appears after a click. Simple test: disable JavaScript in your browser. If you can't see the key content, AI can't see it either.
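The no-JavaScript test can also be scripted. Here's a minimal sketch in Python (the function names, user agent, and the 4-second timeout are my own assumptions, chosen to mirror the bot behavior described above): fetch the raw HTML the server returns, without executing any JavaScript, and check whether your key phrases appear in it.

```python
import urllib.request

def fetch_raw_html(url: str, timeout: float = 4.0) -> str:
    """Fetch the server-rendered HTML only. No JavaScript is executed,
    which approximates what most AI crawlers actually see."""
    req = urllib.request.Request(url, headers={"User-Agent": "site-check/0.1"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return resp.read().decode("utf-8", errors="replace")

def key_content_visible(html: str, phrases: list[str]) -> bool:
    """True only if every key phrase appears in the raw HTML."""
    lowered = html.lower()
    return all(p.lower() in lowered for p in phrases)
```

If `key_content_visible(fetch_raw_html("https://yoursite.com"), ["your product", "your pricing"])` comes back False, your core message lives behind JavaScript, and most bots never see it.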

Outdated information. AI finds a post from 2022 and treats it as a current offer. ChatGPT might recommend products you no longer sell because an old blog post is still online. AI doesn't know what's current if you don't keep your content organized.

Unclear communication. Sentences like "we offer innovative business solutions" sound professional but say nothing concrete. AI tries to guess from context, often incorrectly. A company building software for restaurants might be described as a food delivery service.

Technical blocks. Many companies accidentally block AI bots in their robots.txt file. Check that you aren't blocking GPTBot (ChatGPT), ClaudeBot (Claude), or Google-Extended (Gemini).
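A quick way to check is to open yoursite.com/robots.txt and look for Disallow rules targeting these user agents. A robots.txt that explicitly allows the AI crawlers named above might look like this (a sketch, not a one-size-fits-all policy):

```
# Explicitly allow the main AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Google-Extended
Allow: /
```

If you find `Disallow: /` under any of these user agents, that's the block.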

Where to Start with Optimization?

The foundation is the basics. They may sound boring, but they're what actually works.

Loading speed. The site should load in a maximum of 3 seconds. Check this in Google PageSpeed Insights. A score below 50 points signals a serious problem.
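Beyond PageSpeed Insights, a rough check can be scripted. A minimal sketch (the timeout and user agent are my assumptions; note this measures only how long the server takes to deliver the raw HTML, not full page rendering, so it's a lower bound on real load time):

```python
import time
import urllib.request

def load_time_seconds(url: str, timeout: float = 10.0) -> float:
    """Time how long it takes to download the raw HTML of a page.
    Rendering and JavaScript execution are not included."""
    start = time.monotonic()
    req = urllib.request.Request(url, headers={"User-Agent": "site-check/0.1"})
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start
```

If even this raw download takes a second or two, no amount of front-end optimization will get you under the 3-second mark.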

Content accessibility. Main information must be visible immediately, without the need to execute JavaScript or click buttons. Most AI bots won't perform these actions.

Message clarity. The first sentence on your homepage should directly communicate who you are, what you do, and for whom. Instead of "innovative solutions," write specifically "CRM for small retail businesses."

Content freshness. Regular updates, at least once per quarter, show AI that information is fresh. If the last post is from a year ago, it signals the site might be outdated.

Without these fundamentals, llms.txt won't help at all. It's like buying expensive tires for a car with a broken engine. The tires might be great, but the car still won't drive.


When Does llms.txt Actually Make Sense?

There are a few situations where this file might bring benefits.

Large platforms with hundreds of documentation pages, multiple product versions, or extensive knowledge bases. On sites like that, AI can get lost in the structure, and llms.txt can point it to the right starting point.

But if you run a typical company with 10-20 subpages and a clear structure, this time is better spent on other things. Really.

Questions You Should Ask Yourself

Instead of focusing on llms.txt, check the fundamentals.

Does your site load quickly? If not, that's priority number one.

Is content visible without JavaScript? Disable it and check what remains.

When did you last update content? If it was long ago, start there.

Do you clearly communicate what you do? Show the site to someone outside your industry. They should understand it in 10 seconds.

If you don't know the answers to these questions or the answer is "probably not," forget about llms.txt for now. You have more important things to sort out.

What About llms.txt?

If you have 15-20 minutes and want to add it, it won't hurt. There are even generators that make creating this file easier. You list a few key pages, add short descriptions, upload to the server, and you're done.
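If you'd rather skip the generators, the file is simple enough to build yourself. A minimal sketch (the company name and pages below are made-up placeholders; swap in your own):

```python
def build_llms_txt(name: str, description: str,
                   pages: list[tuple[str, str, str]]) -> str:
    """Assemble an llms.txt body from (title, url, description) triples,
    following the structure shown earlier in this article."""
    lines = [f"# {name}", f"> {description}", "", "## Most Important Pages"]
    for title, url, desc in pages:
        lines.append(f"- [{title}]({url}): {desc}")
    return "\n".join(lines) + "\n"

# Hypothetical example data -- replace with your own key pages.
example = build_llms_txt(
    "Acme CRM",
    "CRM for small retail businesses",
    [("Product", "https://example.com/product", "feature overview"),
     ("Pricing", "https://example.com/pricing", "plans and prices")],
)
```

Write the result to a file named llms.txt and upload it to your site's root directory.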

But it shouldn't be at the top of your priority list. It's something you do at the end when everything else is already working. Not at the beginning, not instead of fundamentals, only as a supplement.

Summary

The tech industry loves new tools and standards.

Maybe llms.txt will stick around. Maybe in a year everyone will forget about it.

The best strategy is to do the basic things well. Fast site, clear message, current content, accessibility for bots. This worked before llms.txt and will work regardless of what's trendy in a year.

And llms.txt? If you have fifteen minutes and everything else is already working, add it. It won't hurt. But don't treat it as a solution to visibility problems in AI search.

Because it's not.

Mikołaj Drużkowski
Founder

I code professionally, I'm passionate about new tech, and when I'm not working, I'm probably taking photos.

Got a project in mind? Let's talk