Should your business let the AI search crawlers in?

Companies are grappling with a new question posed by the rise of conversational artificial intelligence tech such as ChatGPT: should they let the chatbots’ web crawlers access their sites?


It’s a conundrum that’s likely to form the basis of MBA case studies in years to come. Imagine you run a firm that’s successfully selling a product or service. Like most modern companies, it relies on online referrals via traditional search engines for a significant chunk of its business. This model is then disrupted by a new wave of AI-powered tools that also crawl the web in their quest for answers.

Giving them access to your website could ensure that your firm is mentioned in the responses that OpenAI’s popular ChatGPT chatbot, say, gives to user queries such as: “Who makes the best-value widgets in the UK?” But doing so means handing over some of your precious intellectual property to feed the large language model (LLM) that powers the technology. You could instead block its web crawler from accessing your site, but in protecting your property this way you’d risk being overlooked in the chatbot’s answers.
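The blocking mechanism itself is simple. OpenAI publishes the user-agent string of its crawler, GPTBot, and well-behaved crawlers honour directives placed in a site’s robots.txt file. As a rough sketch (the exact agents a site chooses to allow or block will vary), a firm wanting to shut out the AI crawler while keeping traditional search referrals could publish something like:

```
# robots.txt at the site root
# Block OpenAI's crawler, which identifies itself as GPTBot
User-agent: GPTBot
Disallow: /

# Continue to allow a traditional search crawler, e.g. Google's
User-agent: Googlebot
Allow: /
```

It is worth noting that robots.txt is advisory rather than enforceable: it relies on crawlers voluntarily respecting the rules, which the major operators say theirs do.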

So which option should you choose? Do you let ChatGPT and its ilk run freely over your firm’s website or do you shut the crawlers out?