Searches for “AI sports betting predictions” have skyrocketed by 4,000% over the past month as influencers share their experiences of using AI to bet and trust in the market continues to grow. Analysts expect the sector to expand rapidly, reaching $29.7 billion by 2032.
While many apps utilise their own AI technology, relying on outside tools like ChatGPT and Gemini exposes users to new risks, from made-up statistics presented as fact to unreliable financial advice.
Industry experts at VIP Grinders and Christoph C. Cemper, founder of AI prompt management company AIPRM, break down the hidden risks of asking ChatGPT for sports tips and reveal what you need to know before jumping on the bandwagon.
ChatGPT doesn’t know the odds, but it will act like it does
AI tools like ChatGPT don’t truly understand current odds, betting markets or insider statistics, and they’re well known for regurgitating false information, as recently reported by The New York Times. These tools make convincing-sounding predictions based on whatever information is available online, which may not come from reliable sources.
When betting on reputable websites, the odds and risks are clearly explained. Blindly following advice from platforms like ChatGPT, by contrast, gives users a false sense of reliability and confidence: the tool doesn’t know the odds, but it will act like it does.
Sharing financial information with AI is risky
You should never share financial information with AI tools, even when asking for advice on something seemingly simple, like deposit methods or troubleshooting. Whenever you enter details such as your betting balance, payment methods or preferred platforms, you’re potentially handing over sensitive financial information.
In 2023, ChatGPT experienced a data breach that exposed 101,000 users’ sensitive information, including social security numbers, email addresses and geographic locations.
Christoph C. Cemper, founder of AIPRM, explains: “OpenAI’s terms and conditions state they may retain your data, including information you’ve chosen to share during your chats, in order to help train their future models. Their terms explain: ‘We may use the data you provide us to improve our models.’ It’s advised to be selective about what you type. If you don’t want it repeated to other users in the future, you should avoid sharing it with ChatGPT.”
AI can push you into riskier bets without you noticing
Experts have recently noted that overuse of AI breeds complacency and can reduce critical thinking, so users should lead with scepticism when using ChatGPT and Gemini. AI tools can nudge, and even encourage, fans to spend more than they’re comfortable with on riskier bets; users need to be able to question and test the credibility of any information these tools put forward.
Telma Casaca, Marketing Director at VIP Grinders, added: “While AI can be a great tool for day-to-day life, letting it make decisions that directly affect your financial situation can be risky, especially with bets you may not have placed otherwise. The industry is becoming more and more aware of these tools, integrating specific, less risk-averse versions into their offerings. However, consumers need to exercise caution and bet responsibly – double-checking any claims made by outside sources like ChatGPT and Gemini.”
You could have money withheld and your account banned
Unknown to many users, the most popular betting sites often prohibit third-party software, including artificial intelligence. Bet365’s terms and conditions, for example, give the site the right to void any relevant transactions, withhold winnings and suspend or close your account. They also note that you may be liable to compensate the company for any losses resulting from prohibited activity.