
Using Artificial Intelligence for these things can land you in big trouble

The use of AI in everyday life is increasing rapidly. Its impact is already visible in numerous forms, from behavioural ad targeting to weather forecasting.

Beyond this, AI is often seen as a tool that can effortlessly estimate the probabilities of real-world events. AI providers have policies that prohibit their models from promoting or participating in any form of gambling. Recently, however, questions have arisen about whether AI is nonetheless advising users on activities like betting. According to a Senate report, when ChatGPT and Gemini were asked which football team to bet on the following week, both suggested that the Ole Miss vs. Kentucky match might be a better choice.

They even predicted that Ole Miss would win by 10.5 points. The actual outcome was different: the team won by only 7. The real concern, though, isn’t that the prediction was wrong, but that the AI recommended betting at all when it is illegal.

Experts Speak

Professor Yumei He of Tulane University conducted an interesting experiment on this topic. She first asked a chatbot for gambling advice and then asked about addiction; in that order, the AI prioritized the initial question and continued to recommend gambling. In a fresh chat where the topic of addiction was raised first, the chatbot flatly refused to recommend gambling. This clearly demonstrates that AI behaviour is highly dependent on the order and context of a conversation.

Experts believe that an AI’s safety layer weakens during long conversations. OpenAI itself has acknowledged that its safety features work best in short chats. Over longer conversations, a model’s answers can be skewed by earlier questions, which is why, in sensitive situations like gambling, it can inadvertently offer advice that harms people struggling with addiction.


Researcher Kasra Ghaharian is concerned about the language AI uses. Chatbots sometimes reach for gambling-culture phrases like “bad luck,” language that could further mislead people suffering from addiction. Moreover, AI often generates answers based on statistical likelihood rather than verified facts, which can make its output misleading.

This is why experts are emphasizing the need to restrict AI’s use in the betting industry. If it is not controlled in time, AI could become a means of further promoting gambling and betting. So if you are using AI for betting, be cautious: it could land you in serious trouble.
