Report — Bovada.lv 2015 — Online Poker In Danger
All of the research and data contained within this report was collected by a team led by the author of this report. All of the software, analysis programs, and statistical models described within this report were custom built by the same team. All reasoning is based on statistical models, Monte Carlo simulations, and basic data science principles. Given this style of reasoning, it is possible that results may vary from player to player to an unknown degree. However, by the same reasoning, out of the ordinary occurrences can still be classified as statistical anomalies.
All data and collection methods have been verified by multiple third party sources and experts. This initial report is written from the perspective of a professional software engineer, data scientist, and poker player. Some terminology will be defined, but given the expected audience of this report, it is assumed that most of the advanced terminology will be understood.
Reasons For This Report
This report was started as part of a private contract placed for private bid by a group of businesses. This group placed the bid in order to find advanced technical details on the current status of online poker. These advanced technical details include the security, legality, and fairness of the companies behind the largest online poker rooms available to players in the USA. It is believed that this group is attempting to take a legal stance within the United States to start new online poker rooms accessible nationwide. Based on the schedule of the report, these businesses are looking to open these operations near the beginning of Q2 2016.
The internal market research shows a large increase in the number of companies claiming a legal status for software systems that allow users to bet on new forms of advanced games (outside of traditional odds based gambling). The most notable companies found in this category are DraftKings and FanDuel. Both companies have estimated valuations of over $1 billion. These companies have shown initial actions attempting to break into similar markets, including online poker. They have the benefit of extremely advanced software engineering teams and large marketing budgets. Internal company contacts have confirmed that both companies have annual marketing budgets above $100 million. These companies would likely purchase a company that has already been in the online poker space, such as Full Tilt Poker.
As part of the report, Bovada Poker was determined to be one of the top three online poker rooms currently available to US citizens. Bovada Poker is considered safe for American players across over 40 states and offers reliable forms of deposits and withdrawals. This legality is based on the international status of the holding company. Online poker companies based in the United States are granted legal status on a state by state basis. There are a very limited number of poker companies in the United States that hold any form of legal status. There are even fewer that have complete, nationwide legal status.
One note on this legal status: Bovada was withdrawn as a legal gaming product for players in a handful of states, including Delaware and Nevada. This came after these states reportedly started requiring that online betting companies pay an upfront tax and a recurring fee on proceeds from players within the state. These large fees caused Bovada to shy away from these states and stop play for users within them. Based on conversations with state representatives in some of the largest online gambling states, it is possible that more states will follow suit in order to collect taxes on online gambling that takes money out of each state.
Bovada Poker was established as a continuance of the Bodog Poker brand by the Mohawk Morris Gaming Group (MMGG) of Canada. According to reports and multiple articles, it is believed that MMGG has had all of its gambling software tested for security, fairness, and gaming control compliance. Since Bovada Poker is not located within the United States, it is confirmed that the software has not been verified by the Nevada Gaming Commission or related American gaming control boards. Information on the verification processes undertaken on the software at Bovada is unclear, and full descriptions do not publicly exist.
Due to this, our team had an initial belief that Bovada’s poker room software may have potential security risks not checked by adequate control boards. The research described within this report will statistically confirm this belief and show that Bovada Poker is operating an unfair poker room (even while passing modern gambling software verification tests).
This fairness issue could be linked to one or more rogue employees or a hacker group that may have compromised the random number generator and shuffling algorithms used by the poker room software. It is also possible that this is a coincidental mistake made by the original programmers of the random number generator and shuffling algorithms. Another, less likely possibility is that Bovada is knowingly using its own software loophole to steal large amounts of user money. This could be similar to the situation that happened to Absolute Poker and Ultimate Bet between 2005 and 2008 (History). Likewise, this report shows that common cheating practices such as collusion and bot play are occurring frequently without any security intervention.
Following this initial public circulation report, a secondary full length report will be released on a case by case basis for private use. The secondary report will include full software sources, data access, and image/video evidence collected throughout the testing phases.
Please feel free to contact our team at firstname.lastname@example.org for further details and requests.
The team behind this research is comprised of
- 1 data scientist (15+ years professional experience)
- 1 gambling software developer (helped build one of the largest online gambling software systems available)
- 1 software engineer (5+ years of professional experience)
- 1 mathematician (PhD in Statistics)
The team was constructed based on software and mathematics credentials, experience within the gambling software industry, and experience testing fairness within a real life statistical model. Previous research completed by this team includes a user behavioral analysis tool, constructed for an older online poker room, that was capable of detecting collusion across multiple tables, even with zero background connections (IP addresses, locations, names, software hints, etc.). This previous research was packaged and sold to 4 of the top 10 United States based online poker rooms prior to 2011.
The team was constructed from top industry professionals within Silicon Valley, San Francisco, New York City, and Ontario. These individuals were screened for biases towards Bovada prior to beginning work. The team members span industries including gaming, advertising, data mining, open source software development, security, and database solutions.
As part of the initial research for this project, multiple online forums and gambling related social networks were combed for stories based on the potential unfairness of the Bovada Poker room. Based on the data collection of public posts within the past year, 1,461 separate stories of single or multi hand suspicious activity at Bovada Poker were found, collected, and verified.
A large number of the stories collected described single hand bad beats and multi big hand pot losses. A bad beat is defined as a hand in which one player has a substantial odds advantage over other players, only to be beaten by late drawn community cards. Surprisingly, when contacted directly, many of the players behind these stories also provided saved hand histories (data saved on the Bovada Poker software or locally on their own computers) confirming their stories.
A multi big hand is a situation in which a single player wins multiple large pots in which multiple players have complete hands of a straight or above within a short period of time at a single table. Another form of a multi big hand is considered to be when multiple players receive face card pocket pairs (Jack/Jack, Queen/Queen, etc.) within the same hand, forcing a large series of early bets.
The majority of the stories revolve around the system producing an unfair outcome on a large number of hands. This unfair outcome tended to be in the form of one player winning large pots through the course of multiple hands that did not seem to follow the odds. The stories did not directly show a belief that there is an unfair advantage of one user over others (referred to as a Super User advantage). This heavy focus on Bovada Poker potentially being unfair has led our team to focus the research on the mass collection of hand histories in order to compare hand probabilities to theoretical averages and see if there are any anomalies.
As described on Bovada Poker’s website, as well as many other poker related websites, bad beats do naturally happen in poker. Statistically, bad beats occur widely within both online and offline poker rooms. Most players consider a bad beat to be one player having at least an 80% winning percentage over another player (or players) and still losing the hand.
Based on simple math, this form of bad beat would still happen in 1 out of every 5 such situations. Online poker rooms such as Bovada tend to deal a very large number of hands over short periods of time. As recorded in the data collection, Bovada Poker tables can average 60–100 hands an hour. Real life casino poker room tables average 20–40 hands an hour at their very best. This large number of hands in a short period of time means that the number of bad beats will also be large in comparison, even while remaining statistically consistent with the odds. Based on the data collection, our team will prove that Bovada Poker is dealing these low odds hands far more often than normal odds suggest.
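The volume argument can be made concrete with back of the envelope arithmetic. The fraction of hands that reach an 80/20 showdown below is an assumed figure for illustration, not one measured in the data:

```python
# Illustrative arithmetic only: how raw hand volume inflates the number
# of bad beats a player sees per hour. The 25% "contested showdown"
# fraction is an assumption for this example.

def expected_bad_beats_per_hour(hands_per_hour, bad_beat_prob=0.20,
                                contested_fraction=0.25):
    """Expected bad beats per hour, assuming `contested_fraction` of
    hands reach a showdown where one player holds an 80%+ edge (so the
    favorite still loses with probability `bad_beat_prob`)."""
    return hands_per_hour * contested_fraction * bad_beat_prob

online = expected_bad_beats_per_hour(80)  # mid-range online rate quoted above
live = expected_bad_beats_per_hour(30)    # mid-range live casino rate
print(f"online: {online:.1f}/hour, live: {live:.1f}/hour")
```

Under these assumptions an online player simply sees bad beats several times as often as a live player, before any unfairness enters the picture.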
Data Collection — Reasoning
Our team decided to focus on the most commonly played form of poker, Texas Holdem. Bovada Poker offers 9, 6, and 2 player tables for Texas Holdem. Bovada Poker provides play money tables, cash tables, sit-and-go tournaments, scheduled tournaments, and Zone Poker (a fast form of Texas Holdem where every hand is played with a new table of players).
For data collection purposes, our team decided to only record hand history data from normal 6 and 9 player cash tables. Tournament tables and Zone Poker tables are not viewable to non players and also tend to invoke non normal play from users since the chips are not real money. Also, to test full fairness across a large number of hands, it was decided to collect data from all table limits. Bovada Poker offers limits ranging from $0.02/$0.05 all the way up to $30/$60.
Based on the bad beat stories and the wide range of examples posted publicly across multiple different websites and forums, our team decided to focus our attention on the collection and examination of hand data based on the probabilities of different hand types. To support this, our team built multiple software tools aimed at helping to collect, categorize, analyze, and visualize hand data. The different hand types analyzed include:
- Bad Beat - described above.
- Super Bad Beat - a player has over a 99% chance of winning a hand and loses to late community cards. Usually the result of a player hitting 2 specific cards in a row.
- Multi Big Hand - a hand when multiple players at the table end up having a very high ranking hand such as a straight, flush, full house, straight flush, or royal flush.
- Oddball Win - a player over plays a low ranking starting hand and wins against a very high ranking starting hand.
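For reference, the equity thresholds in these definitions can be expressed as a tiny classifier. The function and its interface are illustrative, not part of the team’s actual tooling:

```python
def classify_loss(win_prob_before_final_cards: float) -> str:
    """Label a losing hand according to the categories above.

    `win_prob_before_final_cards` is the losing player's equity before
    the deciding community card(s) fell: >99% marks a super bad beat,
    >=80% a bad beat, anything lower a normal loss.
    """
    if win_prob_before_final_cards > 0.99:
        return "super bad beat"
    if win_prob_before_final_cards >= 0.80:
        return "bad beat"
    return "normal loss"

print(classify_loss(0.995))  # super bad beat
print(classify_loss(0.85))   # bad beat
print(classify_loss(0.60))   # normal loss
```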
One other defining trait of Bovada Poker is that it is considered an anonymous poker room. Most online poker rooms have usernames and account IDs connected to players. Having a connection attribute allows players to collect hand histories based around themselves and other individual players, even when moving across multiple tables. At Bovada, this information is not provided. The only defining factor given to a user at any table is their table position number (1–9). Because of this, our team decided that single user intra table data would need to be collected and analyzed separately in order to match hand histories to single users across multiple tables. This data includes details that could define a single user, such as:
- Timing on actions — the average time it takes for a user to take an action when it becomes their turn.
- Non uniform bet amounts — some users have a specific betting strategy during certain hand types. Sometimes, the amounts are awkward in comparison to the table blinds.
- EV % — the percentage of hands in which a player decides to voluntarily place money into the pot (calls, raises), not including forced blinds.
- Bluff % — the percentage of completed hands that the user attempts to bluff on.
- Win % vs. Play % — the percentage of complete hands the user wins compared to the percentage of hands the user plays through completely.
- Multiple smaller comparisons: sit out times, changes in bet styles, tilt timing, etc.
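A sketch of how such a behavioral signature might be represented and compared follows. The field names, scaling, and distance measure are illustrative assumptions, not the team’s actual schema:

```python
from dataclasses import dataclass
import math

@dataclass
class PlayerFingerprint:
    """Per-seat behavioral signature built from the traits listed above.
    Field names are illustrative, not the report's real schema."""
    avg_action_time_s: float   # timing on actions
    odd_bet_ratio: float       # share of non uniform bet amounts
    ev_pct: float              # % of hands voluntarily putting money in
    bluff_pct: float           # % of completed hands bluffed
    win_vs_play_pct: float     # win % relative to play-through %

    def distance(self, other: "PlayerFingerprint") -> float:
        """Euclidean distance over roughly normalized features; a small
        value suggests two seats may be the same person."""
        pairs = [
            (self.avg_action_time_s / 10.0, other.avg_action_time_s / 10.0),
            (self.odd_bet_ratio, other.odd_bet_ratio),
            (self.ev_pct, other.ev_pct),
            (self.bluff_pct, other.bluff_pct),
            (self.win_vs_play_pct, other.win_vs_play_pct),
        ]
        return math.sqrt(sum((a - b) ** 2 for a, b in pairs))

# Two seats at different tables with very similar signatures.
a = PlayerFingerprint(2.1, 0.30, 0.24, 0.08, 0.55)
b = PlayerFingerprint(2.2, 0.28, 0.25, 0.07, 0.54)
print(f"distance: {a.distance(b):.3f}")
```

In practice a matcher would compare every pair of seat signatures and flag distances below a tuned threshold as candidate matches.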
Data Collection — Development
Once the reasoning behind the data collection was set, our team started work on the architecture of the data collection system that would be attached to Bovada Poker - Texas Holdem - cash tables. The base level software goal was to collect all hand data and the resulting shift of money within a single hand. The first step of this architecture was to find a way to collect hand history data when the user watching the table is not a sitting player.
Once this development was complete, our team started setting up a stream on every poker table available, monitoring them for inaccuracies or problematic reporting and building analysis systems from the data collected. The collection process was perfected fairly quickly, and allowed for seamless data collection from all cash Texas Holdem tables for months on end, with little to no interference. The data was also correlated with consistent screenshots taken at the end of hands in order to corroborate the data with a visual check.
As a security note on the data collection process, our team was able to set up the data collection streams and stay at a single table indefinitely, until the table was closed and a new table was created. In most other online poker room applications our team has studied, there is a limit to the amount of time a non player can watch a table before being kicked back to the main lobby. The small number of software security measures taken at Bovada is a troubling sign of a lack of security protocols and checks. This could signify a lack of general security robustness and could explain why potential security breakdowns go unnoticed by Bovada.
Data Collection — Initial Results
After almost a year of data collection, over 20 million hands were collected and stored in a central database. Within these hands, over 1.4 million separate intra table players were recorded (meaning that each time a person takes a seat at a table, they are counted as a single player until they leave that table). The number of distinct individual players is therefore far lower, most likely in the tens of thousands. It does, however, mean that a large number of players consistently play different tables at different times, and even play multiple tables at once.
All of this data was then aggregated into multiple separate collections based on analysis groups. These analysis groups were formed to break the data into forms that can be tested for a distinct outcome. These separate checks were primarily the different types of hands that the research is based on (bad beats, big hands, etc., listed above). These groups then made it very simple to scan the central database and match different hand types to distinct players at a table utilizing timestamps, table IDs, and hand IDs.
From the over 20 million hands collected, approximately 124.2 million user actions were recorded (simple system messages not attached to a player were filtered out in preprocessing). The saved actions include all bets, raises, calls, blinds, users entering a seat, users leaving a seat, and so on. These actions were digested into several different user analysis tables that connect the actions to each user and model user behavior over multiple hands. Machine learning algorithms were built for determining trends that could match a single player seated at different tables (trying to guess when players at different tables are the same person).
All of these actions were also used to look for trends related to specific hand types. The team flagged as anomalous any players consistently overplaying low percentage or losing hands and winning. This type of play can signal that a player has unfair knowledge of other players’ cards or the community cards. Similarly, an analysis system was built around the connection between hand and player action data in order to find unusual streaks, strange multi hand outcomes, and other out of the ordinary play.
The system used for user behavior analysis was a time window, play style algorithm. Based on a set of multiple characteristics (aggression level, percentage of hands played, win percentage, etc), a user can be classified as a specific type of player over a statistically significant number of hands (a time window). By judging play style over every set number of hands (every window), it becomes easy to find large changes in play style that occur quickly (adjacent windows).
These large changes, such as a normally very tight player becoming extremely aggressive, signify either a change in the player’s state of mind (such as tilt) or a point in which the user may have an unfair advantage over other players through some piece of unknown information. This form of player behavioral analysis is commonly used in real life casinos to locate players cheating at table games. The most well known cheating system that would get caught by this analysis is black jack card counting. Card counting requires a player/group of players make a significant change in play when they determine that there is a large change of odds in their favor (an edge case change).
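A minimal sketch of the windowed approach described above, assuming a simplified action encoding; the window size, aggression measure, and jump threshold are illustrative, not the report’s actual parameters:

```python
def window_stats(actions, window=50):
    """Aggregate per-window aggression: the fraction of actions that
    are bets or raises, over consecutive `window`-action chunks."""
    stats = []
    for i in range(0, len(actions) - window + 1, window):
        chunk = actions[i:i + window]
        aggr = sum(1 for a in chunk if a in ("bet", "raise")) / window
        stats.append(aggr)
    return stats

def flag_style_shifts(stats, threshold=0.25):
    """Indices where aggression jumps sharply between adjacent windows,
    signaling tilt or possible unfair information."""
    return [i for i in range(1, len(stats))
            if abs(stats[i] - stats[i - 1]) > threshold]

# Toy run: a tight player (mostly folds/calls) who suddenly turns aggressive.
actions = ["fold"] * 45 + ["call"] * 5 + ["raise"] * 35 + ["bet"] * 15
stats = window_stats(actions, window=50)
print(stats, flag_style_shifts(stats))  # [0.0, 1.0] [1]
```

The real system would track several characteristics per window (aggression, hands played, win percentage, and so on) rather than a single scalar, but the adjacent-window comparison is the core idea.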
Data Collection — Analysis
After all of the data was collected and the initial analysis systems were built, our team started to look at the final results. The results were in the form of totals of certain types of actions/hands, areas of data where an anomaly could possibly have taken place, and connections between actions and hand data that could signify unfair advantages.
From the base level data returned, there were very few situations in which our analysis systems picked up suspicious user behavior regarding player betting styles. In past poker room cheating scandals (see Ultimate Bet/Absolute Poker above), suspicious users were found through the use of full hand data available only to the poker rooms themselves. This full hand data lists all cards at the table and would allow an analysis program to find every situation where a user may have had unfair knowledge of other players’ cards.
The statistical comparisons made to look for the super user cheating usually revolved around two metrics. The first metric is the number of Big Blinds won over 100 consecutive hands. This means that at a table where the Big Blind is $1, the metric would be calculated by how many dollars you are above or below what you started with. A professional poker player usually averages 8 to 10 Big Blinds in winnings over 100 hands. The second metric used is the level of River Aggression. This metric is calculated by looking at the play style on the river card. In the Absolute Poker cheating case, cheating players had almost infinite aggression on the river card, only betting or folding, never calling.
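The first metric reduces to a one line calculation; a sketch, assuming winnings are tracked in table currency:

```python
def big_blinds_per_100(net_winnings, big_blind, hands_played):
    """Winnings expressed in Big Blinds per 100 hands, the standard
    win-rate metric described above."""
    return (net_winnings / big_blind) / hands_played * 100

# A strong professional rate per the text is roughly 8-10 bb/100:
# e.g. winning $9 over 100 hands at a $1 Big Blind table.
rate = big_blinds_per_100(net_winnings=9.0, big_blind=1.0, hands_played=100)
print(f"{rate:.1f} bb/100")
```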
From our base analysis, without the full hand data as described above, what occurred at Absolute/Ultimate Bet is not occurring on Bovada Poker in a noticeable way (due to the nature of the cheat, if it’s not noticeable, it’s most likely not happening). It was very difficult to find situations where one player at a table was able to continuously win over other players in a majority of hands, barring some outlying cases that will be described below. However, based on the idea that a person working against a poker room would know about past exploits, our team decided to look at edge casing as a potential point of interest.
Edge casing is defined as an attack on an algorithm (usually in a mathematical form) that looks for and targets extreme cases separately from normal cases. Edge casing is commonly used by exploiters and encryption developers to find or make back doors into an encrypted system. The idea behind edge casing is that instead of trying to win every hand, a player can find a small number of extreme cases and take advantage of them in order to make a few extremely large wins that far outweigh any losses.
This is similar to the strategy developed by black jack card counters. Edge casing works well in online poker because it is almost impossible to detect: an edge case appears to be an average low odds hand that should statistically occur every so often. Detectability depends on the greed of the individual and how often and how large their winnings become. Again, looking at the average number of Big Blinds won over 100 hands, an edge case where a player wins a substantial all-in pot could mean a 40–50x increase in that total. If a few of those large edge case hands occur within 100 hands, that could mean 100 or more Big Blinds won over 100 hands. This amount would be a dozen standard deviations from the average of even a professional poker player, putting its odds on the order of a few million to one.
However, there are several differences between black jack card counters and an online poker room edge case scenario. First, online poker rooms make money off of table rakes. A rake is a small percentage of a total pot that is taken by the poker room after every post flop hand. Since the outcome of the hand does not benefit or hurt the poker room, the poker room has no reason to stop edge case hands that would produce a much higher than average rake since players would tend to place a larger amount of money in to the pot more often.
Second, online poker rooms depend on a random number generator algorithm designed from a mathematical equation (pseudo random). That same algorithm could have been designed to produce edge cases at predictable times without dramatically altering the base algorithm. It is also possible that a mathematical mistake left the random number generator with a sub section that creates these edge cases by accident. This would most commonly come in the form of a sub algorithm designed to fix a problematic section of the random number generator that was producing non random outcomes (trying to alter outcomes to ensure randomness can itself create non random effects).
A random number generator tends to be a perfect place to hide edge casing because of its size (usually tens of thousands of lines of code) and its use of a wide variety of mathematical formulas to ensure randomness. The hard part in purposefully making a random number generator produce edge cases is mathematically hiding the behavior in the overall algorithm while still having it make logical programming sense. The random number generator is one of the most crucial pieces of an online poker room, so it is guaranteed that a group of expert level programmers and mathematicians would go over the source code with a fine toothed comb to verify its fairness. Any individual with a substantial background in encryption and advanced mathematical algorithm theory would be capable of both creating an edge case scenario and finding hidden edge case algorithms.
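As a concrete illustration of how an innocent looking change can quietly skew a deal, consider the classic "swap with any index" shuffle bug. This is a well known textbook flaw shown purely as an example of hidden non-randomness; it has no connection to Bovada’s actual code:

```python
import random
from collections import Counter

def naive_shuffle(deck, rng):
    """A classic *broken* shuffle: swap every position with a random
    index over the whole deck. It looks reasonable, and passes casual
    inspection, but produces a non-uniform distribution over orderings."""
    deck = list(deck)
    n = len(deck)
    for i in range(n):
        j = rng.randrange(n)  # bug: a fair Fisher-Yates uses randrange(i, n)
        deck[i], deck[j] = deck[j], deck[i]
    return tuple(deck)

rng = random.Random(1)
counts = Counter(naive_shuffle("ABC", rng) for _ in range(60_000))
# A fair shuffle would give each of the 6 orderings ~10,000 hits; the
# naive version systematically over- and under-represents some of them.
for perm, c in sorted(counts.items()):
    print(perm, c)
```

On a 3 card deck the bias is easy to measure directly; on a 52 card deck the same class of mistake is statistically real but far harder to spot without millions of sampled deals, which is exactly the point.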
One of the most well known examples of this type of exploit involves Ronald Harris (https://en.wikipedia.org/wiki/Ronald_Dale_Harris), who worked for the Nevada Gaming Control Board testing the software behind casino gaming machines. Harris determined that the number generation behind an Atlantic City casino Keno game was predictable, and built a separate program that could predict the numbers of an upcoming Keno game from the winning numbers of previous games. By playing one such predictable game and hitting the jackpot, he won over $100,000 on a $5 ticket, only to be caught when he sent a friend to pick up the winnings.
When looking at the data for potential edge case advantages, the analysis based on hand types (bad beats, big hands, etc.) showed some very troubling results. The biggest problem was the high number of multi player complete hands (hands where multiple players receive high value, high percentage winning hands) and how many of those hands had large bets associated with them. To most players, these hands would appear to be premade hands (where the only winning pocket hand combination is the one your opponent is holding).
From calculated averages across all hands collected, approximately 1 in 18.4 hands resulted in a big hand played out to a 15+ Big Blind win (one player winning a substantially large pot). To be more exact, the average for a 6 player table was 1 in 20.6 (4.85%), and for a 9 player table it was 1 in 17.9 (5.59%). Of these hands, over 55% resulted in players placing 80% or more of their table chip stack into the pot.
In order to test these numbers, our team built a very simple Monte Carlo simulation, based on the hand library pictured above, that would randomly deal hands of poker and determine the odds of multi player complete hands occurring. All hands in the simulation were dealt completely through, meaning that all players at the table would have hands that could be counted towards a big hand.
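The report’s simulator is not reproduced here, but the flavor of the approach can be sketched with a deliberately simplified stand-in that models only one form of multi big hand from the definitions above: two or more players dealt face card (or better) pocket pairs in the same hand. Everything below is an independent illustration, not the team’s code:

```python
import random

RANKS = "23456789TJQKA"
DECK = [(r, s) for r in RANKS for s in "cdhs"]  # 52-card deck
PREMIUM = set("JQKA")  # face card or better pocket pairs, per the definition

def deal_has_multi_premium_pair(rng, players=9):
    """Deal hole cards to `players` seats; True if two or more seats
    hold a pocket pair of Jacks or better in the same hand."""
    cards = rng.sample(DECK, players * 2)
    pairs = 0
    for i in range(players):
        (r1, _), (r2, _) = cards[2 * i], cards[2 * i + 1]
        if r1 == r2 and r1 in PREMIUM:
            pairs += 1
    return pairs >= 2

def estimate(trials=200_000, players=9, seed=7):
    """Monte Carlo estimate of the per-hand probability of this event."""
    rng = random.Random(seed)
    hits = sum(deal_has_multi_premium_pair(rng, players) for _ in range(trials))
    return hits / trials

print(f"~{estimate():.4%} of 9-handed deals")
```

Even this toy version shows the method: simulate billions of fair deals, measure how often the suspicious pattern arises by chance, and compare that baseline against the observed rate.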
As a separate case study, our team contacted Poker Tracker and purchased a database of approximately 216 million poker hands tracked by their software from Bovada, Bet Online, SportsBetting.ag, America’s Cardroom, and 888 Poker. This data was then parsed, cleaned, and analyzed in exactly the same way as the hand data our team collected directly from Bovada.
Our Monte Carlo simulation showed, over billions of random hands, that a potential big hand occurs once in every 14 hands (7.14%) for a 6 player table and once in every 12 hands (8.33%) for a 9 player table. These baseline odds are alarmingly close to the rates collected directly from Bovada, considering that the simulated odds assume every single hand is played out to completion (meaning no folding of potential big hands).
Next, base level folding logic was written into the tests, based on calculated averages for folds of specific hands in specific situations (for example, a player will most likely fold a hand of [2, 7] to any bet). The simulation was then run again, this time with hands being folded based on the type of hand and how the hand plays out against the early community cards. It was determined that a big hand for a 9 player table would normally occur 1 in 39.4 hands (2.53%) and for a 6 player table would normally occur 1 in 46.8 hands (2.14%). Even taking the folding logic to the very upper limits of aggressive play, the upper limit averages were 1 in 35.7 hands (2.80%) for a 9 player table and 1 in 40.1 hands (2.49%) for a 6 player table.
From these values, it is easy to see that there is a large difference in the average occurrence of a big hand. Based on variance and the rules of significance, our team decided to test the odds of drawing 20 million hands from the random Monte Carlo simulation that would produce odds equal to or greater than those collected from the Bovada hand data. To do so, tens of thousands of smaller Monte Carlo simulations (20 million hands each) were conducted, and the results were compared as a group.
Even with exaggerated upper limit values on folding (to the point where a player would need to be impossibly aggressive), it was determined that a 20 million hand sample with multi player big hand percentages similar to or larger than the hands collected from Bovada Poker sits 9 standard deviations above the mean. This means that the odds of playing 20 million random hands and seeing the number of multi player big hands that our team collected would be roughly 1 in a few hundred trillion. This fact alone proves that something is very far off with the random shuffling at Bovada. To build a direct comparison, our team also graphed the histogram of the tested odds alongside the odds collected from Bovada and the third party hand databases purchased from Poker Tracker.
Going beyond the big hand data, there was another troubling fact when looking at the user data that was connected to big hands. A majority of big hands (over 50%) occurred within 10 hands of another big hand (10 hands before or after) and over 60% of these multi big hands were won by the same seat/player. This means that a single player would win out on a big hand, only to receive, play, and win another big hand only a few hands later.
Going back to the Monte Carlo simulation, our team tried to test how often this scenario would occur, even in the situation where every player plays every hand through to completion. Based on 20 billion random hands simulated, having two big hands close together (within 10 hands) where the same seat/player “technically” wins both occurs once in every 10,486 hands (0.0095%). Based on the 20 million hands collected from Bovada, over $1.35 million was transferred to players winning these multi big hands.
Finally, the fact that drove the final nail into the coffin on the question of fairness came when the team looked at the user action data connected to the players winning these multi big hands. Over 80% of the players winning these hands showed dramatic play style switches immediately before the big hands occurred.
These switches were characterized by a very tight and neutral player (not playing many hands, not putting much money into the pot, not being aggressive on good hands) changing to a loose and aggressive style (lots of betting, playing low percentage hands, etc.) within the 20 hands prior to a big hand being dealt. This type of switch in a poker player is usually characteristic of someone on tilt (an emotional state causing a poker player to act erratically) and, in normal circumstances, very rarely results in a winning outcome.
Likewise, based on how nearly identical the user behavior signatures are across most of these multi big hands, our user analysis engine predicted that over 1,100 of these hands were won by between 6 and 9 distinct players (the range is due to the odds of 2 players playing almost identically). Some of the play styles were so specifically tailored to these situations, and so close to identical in style, that it is assumed some of the players were bots playing a very specific, programmed pattern to wait for and win these large pot big hands.
When putting all of these data insights together, it can easily be seen that something is not right at Bovada. An overly large number of big hands is being played by a small number of users. Through these big hands, a large amount of money is being transferred to these users, totaling over $1,000,000 every few months. The behavioral play styles of these players are almost programmatic in nature (signaling that they are potentially bots). The play style changes are fine tuned to play these big hands as if the players know ahead of time that they are going to occur. And finally, all of this was tested for statistical significance, with occurrences more than 6 standard deviations above the mean.
The scariest part is that this pattern was recognized over only a subset of the total number of hands being played at Bovada. It is likely that this same behavior is playing out in other poker types as well as tournaments. With the big blinds per 100 hands some players are averaging at Bovada’s tables, it is possible that someone is winning multiple massive ($100,000+) tournaments that occur weekly, bi-weekly, and monthly. The over $1 million in transferred funds recorded by our team could be dramatically higher, especially since these actions could have gone unnoticed for years.
Data Collection — Prediction Model
To take these conclusions to the next step and fully test them, our team decided to build a quick tool on top of the data collection system that would predict when a big hand would occur and which player was going to win. This tool was built on Amazon's AWS Machine Learning service, which trains a model on historical data in order to make live predictions from future API calls.
The initial machine learning system was created, and the hand history data from the 20 million collected hands, along with the attached analysis insights data, was uploaded as a training set. After all of this data was in place, the team added an API call to the data collection system that would predict the odds of a big hand and the player that would win it (predicted separately before and after the flop). Alongside, the team also attempted to write a prediction component that would predict the community cards and other player cards at the table, as a reference test of accuracy.
Based on 103,407 collected hands recorded with prediction system API calls, the results produced were promising. First, the prediction system for locating big hands was able to predict 2,543 big hands pre flop (within a 2 hand range) and 3,810 big hands post flop (within a single hand) out of 4,632 total big hands. Out of those hands, only 86 false positives were recorded. While predicting big hands, the system was also set to predict the hand winner.
From the 4,632 big hands collected, winners were predicted 41.2% of the time pre flop and 74.5% of the time post flop. As a control, our team made side by side guesses at the same decision points as the prediction algorithm. The top recorded team member was able to predict the hand winner 20.4% of the time pre flop and 46.7% of the time post flop. This means the system predicts big hands and big hand winners far better than even a strong human player.
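As a sanity check, the headline detection rates can be recomputed directly from the counts quoted above. The precision figure assumes the 86 false positives attach to the post flop predictions, which the report does not break down further:

```javascript
// Recomputing the big hand detection rates from the counts in the text.
const totalBigHands = 4632;
const preFlopHits = 2543;   // predicted within a 2 hand range
const postFlopHits = 3810;  // predicted within a single hand
const falsePositives = 86;  // assumed to apply to post flop predictions

const preFlopRecall = preFlopHits / totalBigHands;    // ≈ 54.9%
const postFlopRecall = postFlopHits / totalBigHands;  // ≈ 82.3%
const precision = postFlopHits / (postFlopHits + falsePositives); // ≈ 97.8%

console.log(preFlopRecall.toFixed(3), postFlopRecall.toFixed(3), precision.toFixed(3));
```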
On the other side of predictions, our test models for predicting community and player cards were far less successful. The system was able to guess a correct community card (value and suit) 1 out of every 48 cards. This is better than the average of 1 out of 52, but far from a successful system that would add any true value to a player. When predicting player hole cards, the system was able to guess at least 1 card (value, not suit) accurately in 1 out of every 10 hands played to completion and was able to predict both cards (value, not suit) in 1 out of every 35 hands played to completion.
The card prediction systems were very far from providing any true predictive value to the user. However, the big hand and big hand winner prediction systems could provide immense value to an average player who wants to stay away from losing big hands or go all in on winning ones. The dramatic difference between the two prediction accuracies, despite both systems using the exact same data set, says a lot about the nature of the big hand edge case. It shows that it is highly possible to locate these edge cases (even without a complete data set) while the general card shuffling system remains random in nature, passing what would be considered a normal randomness/fairness test.
By utilizing a tool similar to the prediction engine for big hands and winners, a player could substantially increase their odds of winning at Bovada Poker. Such a system would alert users when a big hand is likely and who the winner is most likely to be. From these insights, players could easily escape potential major losses in hands that would normally trap them into a big losing hand. Judging from the public stories written about Bovada, this information would have helped thousands of people who have lost a total of over $1 million to these types of hands. Depending on how long this situation has actually been going on, and since Bovada is one of the oldest brands in online poker, the true total could be $10,000,000+ by now.
After analyzing all of the data and results above, our team has come to several important conclusions. First, an overly large number of big hands are occurring at Bovada. Second, many of these big hands are being won by a very small number of players, often the same player at the same table within only 10 hands of each other. Third, the players winning these big hands very openly switch playing styles shortly before an initial big hand win. Fourth, our team was able to build a prediction tool capable of predicting these big hands and their winners accurately enough to add true value to an average player. Finally, there are many separate stories that corroborate this data with actual player outcomes, but since Bovada is a ‘private’ poker room, no complete data set has ever been collected.
Based on these conclusions, our assumption is that Bovada Poker is highly unfair. There is a massive problem where users are being edge cased into big hands that they then lose large pots over. The random number generator and shuffling systems behind this would still appear completely random, and fair, to the average viewer. However, since our team, with even a relatively small data set, was able to accurately predict big hands and their winners in a majority of cases, someone with inside knowledge would be able to do it with near certainty. This fact alone strongly suggests an insider at Bovada, but without a full data set and further in depth systems, it is impossible to tell who exactly is at fault and where these large winnings are going.
Playing Analysis — Overview
After coming to these conclusions about low odds hands, our team decided that it would be of great value to look into other forms of potential cheating on Bovada. The first methods our team wanted to research were collusion and botting. Collusion is a form of cheating where multiple users sit at a single table and use a form of communication to work together as a team to beat the rest of the table. Botting is the creation of a computer program that plays a specific strategy and can be run continuously on a table with little to no human interaction.
Collusion occurs frequently in online poker, especially since there are many forms of offline communication that can be used with other players (phone, text, in person, etc.). Traditionally, however, this form of cheating is very easy to catch when looking at past hand/table data. By making simple connections between players who play together frequently, as well as looking for commonly used tactics (chip dumping, multi raises to steal blinds, etc.), collusion teams tend to stick out like sore thumbs.
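The "simple connections" check described above amounts to counting how often pairs of players share a table and flagging pairs that appear together far more than chance would suggest. A minimal sketch, with illustrative session data and threshold:

```javascript
// Count how often each pair of player IDs shares a table session and
// flag pairs at or above a co-occurrence threshold. Real detection
// would also normalize by each player's total session count.
function flagFrequentPairs(sessions, minTogether) {
  const pairCounts = new Map();
  for (const players of sessions) {
    const sorted = [...players].sort();
    for (let i = 0; i < sorted.length; i++) {
      for (let j = i + 1; j < sorted.length; j++) {
        const key = sorted[i] + "|" + sorted[j];
        pairCounts.set(key, (pairCounts.get(key) || 0) + 1);
      }
    }
  }
  return [...pairCounts.entries()]
    .filter(([, n]) => n >= minTogether)
    .map(([pair, n]) => ({ pair, sessions: n }));
}

// Illustrative data: p1 and p2 sit together in 3 of 4 sessions.
const sessions = [
  ["p1", "p2", "p9"], ["p1", "p2", "p4"],
  ["p1", "p2", "p7"], ["p3", "p5", "p8"],
];
console.log(flagFrequentPairs(sessions, 3)); // flags p1|p2
```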
Botting, on the other hand, can be much more complicated to catch. Bots that play specific strategies can appear very similar to an advanced player with a tight but aggressive style. Botting is known to occur frequently in Limit Poker, since the strategy and calculations that go into placing bets and making calls can all be easily controlled by a computer program.
No limit poker tends to be a much more difficult game to build a bot for because of the unpredictability and loss of control in the betting scheme. In limit poker, a bot can estimate that its hand is worth a specific number of raises; after reaching that level, it can simply call and force the hand to a conclusion. In no limit poker, by contrast, players can bet any amount in any situation. Because of this, it becomes much more difficult to catch and control for bluffs when comparing a hand to its winning odds.
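The odds comparison a bot faces here reduces to a pot odds calculation: a call is only profitable when the hand's win probability exceeds the call amount divided by the final pot. The numbers below are illustrative:

```javascript
// Pot odds rule: with `potBeforeCall` already including the opponent's
// bet, calling `callAmount` is break-even at
// callAmount / (potBeforeCall + callAmount) equity.
function callIsProfitable(winProbability, potBeforeCall, callAmount) {
  const breakEven = callAmount / (potBeforeCall + callAmount);
  return winProbability > breakEven;
}

// $100 pot, opponent bets $50: pot is now $150, so a $50 call needs
// more than 50 / 200 = 25% equity to show a profit.
console.log(callIsProfitable(0.40, 150, 50)); // true
console.log(callIsProfitable(0.20, 150, 50)); // false
```

The difficulty the paragraph describes is that against an unbounded bet size, `winProbability` must be adjusted for the chance the bet is a bluff, which is exactly what a simple bot cannot estimate well.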
After researching both cheating areas, our team felt it would be most effective to start with simple, easy to catch scenarios and work up to more complicated systems. For collusion, this meant that our team would play together across multiple tables within a short period of time, using very simple tactics and communication methods. By keeping the form of collusion simple and the number of masking measures low, our team would be colluding the same way most simple teams of players naturally would. These are the teams that a person would hope a sophisticated online poker room would catch.
For botting purposes, our team decided to start with the simple program built previously for testing the hands data collection and user behavior analysis. On top of this basic bot system, our team could build a simple hard coded strategy that would play specific hands the same way every time. Based on research into the limit poker bots available, the strategy chosen was a simple sliding odds rule: as the odds of winning a specific hand go up, the larger the raise/call the bot is allowed to make.
Going forward, it was decided to also tie in different forms of user behavior analysis that would categorize each player at a table individually by their play style and hand history. By creating basic rules around different categories of users, the sliding odds system could be modified to increase or decrease the acted-on odds based on the user and the behavior shown during a specific hand. As an example, in a hand where a player goes all in and the bot must decide whether to call with an above average hand, the bot would play differently against a tight, professional player (shark) than against a loose, aggressive player (gambler).
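The sliding odds rule plus per-opponent adjustment can be sketched as follows. The scaling curve, opponent categories, and multipliers here are illustrative stand-ins, not the exact values our bot used:

```javascript
// Sliding odds rule: the bot's maximum bet scales with its estimated
// win odds, then gets nudged by a per-opponent category multiplier
// (tighten up against sharks, press harder against gamblers).
const opponentAdjust = { shark: 0.8, neutral: 1.0, gambler: 1.2 };

function maxBet(winOdds, bigBlind, opponentType) {
  // Illustrative curve: ~2.8 BB cap at 50% odds, ~6.75 BB at 90%.
  const base = bigBlind * Math.pow(10 * winOdds, 1.5) / 4;
  return base * (opponentAdjust[opponentType] ?? 1.0);
}

function decide(winOdds, bigBlind, currentBet, opponentType) {
  const limit = maxBet(winOdds, bigBlind, opponentType);
  if (currentBet === 0) return limit > bigBlind ? "bet" : "check";
  return currentBet <= limit ? "call" : "fold";
}

console.log(decide(0.9, 2, 10, "gambler")); // strong hand, loose foe
console.log(decide(0.3, 2, 10, "shark"));   // weak hand, tough foe
```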
Playing Analysis — Collusion
The first test of collusion was kept extremely simple, intentionally designed to stick out as a collusion team. The playing team was made up of 3 players. The idea was to spread the team members across a 6 person, mid blinds ($1/$2, $2/$4, $3/$6) no limit holdem table. The team members would collaborate over cell phones and mimic common collusion strategies. The main strategies employed were:
- Sharing each team member’s hole cards. The known cards helped the team decide whether a certain hand was winnable and how the winning odds increased or decreased.
- Pushing players out with raise and re-raise scenarios across the table. Multiple players on the team would raise and re-raise in order to steal blinds/initial bets from other players.
- Chip dumping. The above strategies sometimes caused one team member to lose a large portion of their chips to another team member. To even the team back out and keep the collusion value high (while masking the collusion play), players would purposely lose hands to shift chips back to a short stacked team member.
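The first strategy pays off because folded teammate cards change the true odds of completing a hand. As an illustrative example (not from the collected data), consider a flush draw on the turn:

```javascript
// On the turn, a player sees 6 cards (2 hole + 4 board), leaving 46
// unseen. A flush draw is normally 9 outs in 46. If two teammates
// reveal they folded 4 cards including 2 of the flush suit, the real
// odds are 7 outs in 42.
function hitOdds(outs, unseen) {
  return outs / unseen;
}

const naive = hitOdds(9, 46);            // ≈ 0.196
const informed = hitOdds(9 - 2, 46 - 4); // ≈ 0.167

console.log(naive.toFixed(3), informed.toFixed(3));
```

A three-point swing in equity is routinely the difference between a profitable and an unprofitable call, which is why even crude card sharing gives a colluding team a real edge.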
Our team played with minimal security/masking measures in place. Each player on the team played from their home without masking their IP address (all within 30 miles of each other). Each player signed in at roughly the same time, played a few tables separately for 10–20 minutes, and then joined the same table within 5 minutes of each other. The team then used the above strategies for roughly 60–90 minutes (80+ hands) before leaving the table one at a time, separated by at least 5 hands each, or until all of a player’s money was lost (no re-buys were allowed, to reduce losses).
Our team played this strategy 20 times within 3 weeks. Each time, the team went into a table with approximately $800–1,200 in total. Below are the statistics collected through the testing:
a) 7 Large Win Sessions (200%+ Gain) (+$11,545)
b) 3 Medium Win Sessions (50%+ Gain) (+$3,200)
c) 4 Low Win Sessions (0–50% Gain) (+$995)
d) 4 Low Loss Sessions (0–50% Loss) (-$1,040)
e) 2 Full Loss Sessions (100% Loss) (-$2,350)
So, after 20 full sessions in a 3 week period (almost 1 session per day), our team was able to make approximately $12,350 (amounts below $1 won in a session were not counted). The number of full loss sessions was low, and those were caused by communication and play mistakes early in the testing process. Throughout the entire process, the team never received any form of security message, account ban, funds seizure, etc. Each team member used the same player account for every session and followed the same procedures each time.
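The net result quoted above can be confirmed by re-adding the five session categories (a–e):

```javascript
// Session totals from the list above, summed to verify the net figure.
const sessionTotals = {
  largeWins: 11545,   // a) 7 sessions, 200%+ gain
  mediumWins: 3200,   // b) 3 sessions, 50%+ gain
  lowWins: 995,       // c) 4 sessions, 0-50% gain
  lowLosses: -1040,   // d) 4 sessions, 0-50% loss
  fullLosses: -2350,  // e) 2 sessions, 100% loss
};
const net = Object.values(sessionTotals).reduce((a, b) => a + b, 0);

console.log(net); // 12350
```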
From past experience with poker related security software, it would be safe to say that our team’s actions would throw major red flags for any basic log parsing tool. The blatant chip dumping, unnecessary multi raise blinds stealing, and continuous playing of multiple players together at the same table would be visible even if the hand logs were scanned manually. Due to this, our team came to the conclusion that either Bovada does not have this security software in place or does not care to take action against these players.
To confirm this belief further, our team took the play style records for the 20 sessions and extracted the defining data traits that signify collusion efforts. Once these signature traits were pulled out of the data, our team searched for them within the hand data collected in the previous steps. From the over 12 million hands, our team was able to locate 114 table sessions that would easily be classified as collusion. A large percentage of these sessions were fairly long, at least 50 hands. These long sessions show that Bovada was not able to locate these colluding players within a significant number of active hands, so it is unlikely that Bovada located them at all.
Based on these findings, our team can easily confirm that Bovada Poker is a haven for collusion activity. When visiting tables, it does not take many hands to find players chip dumping and stealing blinds. The average player would find this almost impossible to track because of the anonymous nature of Bovada. Bovada, however, should be taking action against these players, and should be doing so within the first few instances of collusion activity. Most other online poker rooms have this functionality perfected; in the full report that will follow this initial public report, our team will show how it was easily caught by other online poker rooms’ security (BetOnline, SportsBetting.ag, etc.) within 1 session and fewer than 20 total hands.
Playing Analysis — Bots
Similar to the collusion testing, our team decided to keep the botting techniques fairly simple to start and build up complexity from there. A simple bot would produce logs that practically scream botting: it would play a very basic strategy, follow that strategy exactly, make no attempt to hide bot behavior by randomizing action times, and utilize the algorithms behind guaranteed long term winning play in limit Texas Holdem Poker.
Using some of the functions located within the Poker.js library, the bot can easily evaluate the current hand and tap into a function that judges the odds of winning it. Following the simple limit Texas Holdem algorithms, the bot assigns a limit dollar value based on the odds of winning the hand. Then, based on the difference between that dollar value and the current bet, the bot decides to check, call, bet, or fold. Following this simple limit value formula, it becomes very easy to build a winning bot system.
Bovada — Online Poker In Trouble
Based on all of the separate tests completed by our contracting team, it is evident that there are massive issues at Bovada Poker: the volume of repeated large hands, collusion teams, and bots. All of these situations were proven through data collection and analysis, and then by using the techniques ourselves to increase our ability to win over average players.
In light of this, it does appear that new, larger poker rooms are making a run at the USA poker market. One poker room that our team tested, considered to be one of the most secure online poker rooms, is America’s Card Room. Below are some of the placements on their landing website that show they place security far above all other features of their poker room.
From our initial testing and report, our team has nothing else to say than that Bovada has serious issues. Bovada has far more issues than most other online poker rooms, despite its size and player count. Without changes and without immediate action, it is likely that Bovada will end up closing shop when reports like this reach the average user market through mainstream media stories.
Thank you for reading, and please look out for our team’s other poker reports coming out shortly. Alongside Bovada Poker, our team is creating security reports on:
- America’s Card Room
- Sports Betting.ag
- Black Chip
Following these separate public reports, a large centralized report with access to all open source software tools, collection data, and image/video evidence will be released at once.
Again, if you would like to contact our team for further details, please feel free to reach out to email@example.com.
Enjoy your poker!
- Data Mine Poker Team