TL;DR: No, machine learning cannot predict winning 4D numbers. The lottery remains random, and the experiment below failed exactly as expected.
Introduction
My good buddy, Ah Seng, approached me with a strange request the other day.
It started from this forum thread on Hardwarezone.
Apparently, the subject, Ms Foyce Le Xuan, could predict winning 4d lottery combinations.
She posted proof of the winnings on her Instagram account week after week.
There was much debate in the forum on the strategies she used to maintain her consistent winning streak.
Although there were contrasting views with regards to her methods, the strong consensus was that she had a piece of 'magical software'.
A programming tool that could generate winning lottery combinations using specialised algorithms.
The Bet
'Oei Terence, if we could get historical data of past winning 4d numbers, I am pretty sure we could build something similar leh!', said Ah Seng.
I was adamant that winning lottery numbers are random events.
If people could devise algorithms to predict future winning numbers using past patterns, it would be all over the news.
Nevertheless, I was keen to disprove his hypothesis.
Machine Learning or Learning Machine?
Using historical data to predict future events sounded like some CSI 'zoom and enhance' AI Technology.
I had no prior knowledge about the obscure field of data science.
Thankfully, there are tons of machine learning resources available online.
The one offered by Google was pretty good - Machine Learning Crash Course.
The Tensorflow website provides extensive resources and documentation as well.
That being said, this exploration into the world of machine learning was definitely not a walk in the park for me.
Requirements
- Python 3
- Google Cloud Machine Learning Engine (optional)
- Sanity
The Setup
- Obtain past winning 4d lottery results from Singapore Pools website
- Data prep and feature engineering
- Training the model and hyper-parameter tuning
- Prediction results
Obtain past winning 4d lottery results from Singapore Pools website
This was when I hit my first roadblock.
If you head over to Singapore Pools 4d Results page, you'll find that they only provide 4d results for the past three years.
With a bit of Google-Fu, I managed to locate a page within their website to check winning numbers for the past 20 years.
The steps taken to extract the data were as follows (a sketch of the scraper appears after the list):
- Download PhantomJS and Chrome Webdriver.
- Set up a virtual environment.
- Install Selenium and BeautifulSoup.
- Create the scraper in Python.
- Save the results to Excel for data prep.
- A whole load of patience.
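For reference, here is a minimal sketch of what such a scraper might look like. The results URL, the CSS selectors and the field names are placeholders I've made up for illustration; the real Singapore Pools page needs its own selectors, and PhantomJS can be swapped in for Chrome on older Selenium versions.

```python
# A minimal scraping sketch. RESULTS_URL and every CSS selector here are
# placeholders, not the real Singapore Pools markup.
import time

import pandas as pd
from bs4 import BeautifulSoup
from selenium import webdriver

RESULTS_URL = "https://example.com/4d-results"  # placeholder; use the page found on the Singapore Pools site

driver = webdriver.Chrome()
rows = []
try:
    driver.get(RESULTS_URL)
    time.sleep(3)  # crude wait for the JavaScript-rendered results to load

    soup = BeautifulSoup(driver.page_source, "html.parser")

    # 'draw-result' and the prize classes are invented names; inspect the real
    # page with your browser's dev tools to find the correct selectors.
    for draw in soup.select("div.draw-result"):
        rows.append({
            "date": draw.select_one(".draw-date").get_text(strip=True),
            "first_prize": draw.select_one(".first-prize").get_text(strip=True),
            "second_prize": draw.select_one(".second-prize").get_text(strip=True),
            "third_prize": draw.select_one(".third-prize").get_text(strip=True),
        })
finally:
    driver.quit()

# Save to Excel for the data prep step.
pd.DataFrame(rows).to_excel("4d_results.xlsx", index=False)
```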
Data prep and feature engineering
My initial thoughts were that I had very little to work with.
The two features (Number, Date) were not sufficient to train the model accurately.
It turns out that we could actually derive meaningful metrics with a little feature engineering elbow grease.
Why not create binary vectors for numbers 0000 to 9999?
That would be an extremely bad idea. The computational resources required for such an intensive task are expensive, and this form of data representation is extremely inefficient.
Instead, we should approach the problem using sparse representation.
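To make the contrast concrete, here is a rough illustration (not this project's actual pipeline) of a dense 10,000-wide one-hot encoding versus a sparse representation of the same numbers, plus a compact digit-level alternative:

```python
# Illustration only: dense one-hot vs sparse representation of 4D numbers.
import numpy as np
import tensorflow as tf

numbers = ["0042", "9313", "4807"]  # example winning numbers

# Dense approach (wasteful): one 10,000-wide one-hot vector per number,
# of which 9,999 entries are always zero.
dense = np.zeros((len(numbers), 10000), dtype=np.float32)
for i, n in enumerate(numbers):
    dense[i, int(n)] = 1.0

# Sparse approach: store only the coordinates of the non-zero entries.
sparse = tf.sparse.SparseTensor(
    indices=[[i, int(n)] for i, n in enumerate(numbers)],
    values=[1.0] * len(numbers),
    dense_shape=[len(numbers), 10000],
)

# A compact alternative: treat each number as four digit features (0-9).
digit_features = np.array([[int(d) for d in n] for n in numbers])
print(digit_features)  # [[0 0 4 2] [9 3 1 3] [4 8 0 7]]
```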
The data is split between 70% (Training Data), 25% (Test Data) and 5% (Evaluation Data).
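A minimal sketch of such a split, assuming scikit-learn (the tooling used for the split isn't stated) and placeholder arrays standing in for the prepared dataset:

```python
# Illustration of a 70/25/5 train/test/evaluation split.
import numpy as np
from sklearn.model_selection import train_test_split

X = np.random.randint(0, 10, size=(1000, 4))  # placeholder digit features
y = np.random.randint(0, 5, size=1000)        # placeholder prize-category labels

X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.30, random_state=42)
X_test, X_eval, y_test, y_eval = train_test_split(X_rest, y_rest, test_size=1 / 6, random_state=42)
# 0.30 * 1/6 = 0.05, leaving 70% training, 25% test and 5% evaluation data.
```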
Training the model and hyper-parameter tuning
The model was trained 100 times using the Gradient Descent and Adam optimisers.
Each round took between 15 to 25 minutes on my MacBook Pro.
I could have leveraged Google's Cloud Machine Learning Engine to cut my training time in half, but I'm a cheapskate.
These were the hyper-parameter settings tested (a model sketch follows the table):
ID | Optimizer | Learning Rate | Epochs | Batch Size | Dense Layers | L2 Regularization | Dropout |
---|---|---|---|---|---|---|---|
1 | Adam | 0.001 | 301 | 16 | 16 | 0.001 | 0 |
2 | Adam | 0.0001 | 301 | 16 | 16 | 0.001 | 0 |
3 | Adam | 0.0005 | 301 | 16 | 16 | 0.001 | 0 |
4 | Gradient Descent | 0.001 | 301 | 16 | 16 | 0.001 | 0 |
5 | Gradient Descent | 0.0001 | 301 | 16 | 16 | 0.001 | 0 |
6 | Gradient Descent | 0.0005 | 301 | 16 | 16 | 0.001 | 0 |
7 | Adam | 0.001 | 301 | 16 | 16 | 0 | 0.5 |
8 | Adam | 0.0001 | 301 | 16 | 16 | 0 | 0.5 |
9 | Adam | 0.0005 | 301 | 16 | 16 | 0 | 0.5 |
10 | Gradient Descent | 0.001 | 301 | 16 | 16 | 0 | 0.5 |
11 | Gradient Descent | 0.0001 | 301 | 16 | 16 | 0 | 0.5 |
12 | Gradient Descent | 0.0005 | 301 | 16 | 16 | 0 | 0.5 |
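For context, here is a minimal Keras sketch wired up with the hyper-parameters from the table (Adam or plain gradient descent, 16-unit dense layers, optional L2 regularisation and dropout). The actual architecture used in this experiment isn't documented, so treat this as an approximation rather than the real thing.

```python
# Approximate model builder reflecting the hyper-parameters in the table above.
import tensorflow as tf

def build_model(optimizer="adam", learning_rate=0.001, l2=0.001, dropout=0.0):
    reg = tf.keras.regularizers.l2(l2) if l2 else None
    layers = [
        tf.keras.Input(shape=(4,)),  # four digit features per number
        tf.keras.layers.Dense(16, activation="relu", kernel_regularizer=reg),
        tf.keras.layers.Dense(16, activation="relu", kernel_regularizer=reg),
    ]
    if dropout:
        layers.append(tf.keras.layers.Dropout(dropout))
    # Five outputs: first, second, third, starter and consolation prizes.
    layers.append(tf.keras.layers.Dense(5, activation="softmax"))

    model = tf.keras.Sequential(layers)
    opt = (tf.keras.optimizers.Adam(learning_rate) if optimizer == "adam"
           else tf.keras.optimizers.SGD(learning_rate))  # plain gradient descent
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
# model.fit(X_train, y_train, epochs=301, batch_size=16,
#           validation_data=(X_test, y_test))
```

Swapping `optimizer`, `learning_rate`, `l2` and `dropout` covers the twelve configurations above.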
The training results for each configuration:

ID | Training Metrics | Notes |
---|---|---|
1 | Epoch 000: Loss: 729.569, Accuracy: 33.531% | |
2 | Epoch 000: Loss: 81.530, Accuracy: 39.018% | Overfitting |
3 | Epoch 000: Loss: 45.983, Accuracy: 40.437% | Overfitting |
4 | Epoch 000: Loss: 6.345, Accuracy: 48.559% | |
5 | Epoch 000: Loss: 3.950, Accuracy: 44.375% | |
6 | Epoch 000: Loss: 19.132, Accuracy: 43.984% | |
7 | Epoch 000: Loss: 143.438, Accuracy: 36.137% | |
8 | Epoch 000: Loss: 301.922, Accuracy: 29.390% | Overfitting |
9 | Epoch 000: Loss: 60.548, Accuracy: 35.370% | |
10 | Epoch 000: Loss: 8.623, Accuracy: 44.781% | |
11 | Epoch 000: Loss: 10.269, Accuracy: 44.650% | |
12 | Epoch 000: Loss: 4.455, Accuracy: 44.737% | |
A few takeaways based on the results:
- In general, the Adam optimisation algorithm performed much better than Gradient Descent.
- Accuracy was affected more by the learning rate than by any of the other parameters.
- Adding L2 regularization and Dropout yielded minimal improvements.
Prediction results
The moment we've all been waiting for!
Based on the trained model, we predict the probability for a set of numbers from the evaluation data.
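Roughly speaking, each prediction is the highest softmax probability assigned to an evaluation example. A sketch, assuming the `model` and `X_eval` from the earlier snippets and the five prize categories as class labels:

```python
# Read off the most probable prize category for three evaluation examples.
import numpy as np

class_names = ["first_prize", "second_prize", "third_prize",
               "starter_prize", "consolation_prize"]

probs = model.predict(X_eval[:3])
for i, p in enumerate(probs):
    k = int(np.argmax(p))
    print(f"Example {i} prediction: {class_names[k]} ({100 * p[k]:.1f}%)")
```

The predictions for all twelve configurations are tabulated below.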
Version | Example 0 prediction | Example 1 prediction | Example 2 prediction |
---|---|---|---|
1 | first_prize (29.9%) | second_prize (0.2%) | starter_prize (0.5%) |
2 | third_prize (0.0%) | second_prize (0.1%) | starter_prize (0.0%) |
3 | third_prize (0.0%) | second_prize (0.0%) | starter_prize (0.0%) |
4 | consolation_prize (41.5%) | consolation_prize (41.0%) | consolation_prize (41.4%) |
5 | starter_prize (4.6%) | starter_prize (4.6%) | starter_prize (4.6%) |
6 | starter_prize (4.5%) | starter_prize (4.5%) | starter_prize (4.5%) |
7 | third_prize (4.3%) | second_prize (11.7%) | starter_prize (0.1%) |
8 | third_prize (0.0%) | second_prize (0.3%) | starter_prize (0.0%) |
9 | starter_prize (4.6%) | starter_prize (4.6%) | starter_prize (4.6%) |
10 | starter_prize (6.0%) | starter_prize (6.1%) | starter_prize (6.0%) |
11 | starter_prize (4.6%) | starter_prize (4.6%) | starter_prize (4.6%) |
12 | starter_prize (4.6%) | starter_prize (4.6%) | starter_prize (4.6%) |
As you can see, the results were unsurprising.
- None of the predictions was satisfactory.
- The predicted probabilities were low even where the corresponding test set accuracy was relatively high.
- Models that overfit produced 0% or close to 0% probability as expected.
Conclusion
If you do a quick Google search for the term '4d prediction singapore', you'll find no shortage of 4D prediction websites.
These predictions are often served with a healthy dose of snake oil.
The draw process engineered by Singapore Pools includes numerous variables to deter fraud, so trying to predict winning combinations would be a fool's errand.
To make anything close to an accurate prediction, you would have to factor in draw machine configurations and draw ball weights, as well as unseen elements like the force of the air jets, atmospheric noise and gravitational pull.
Punters believe that there are patterns to lottery numbers which can help increase the probability of winning.
As with all things in life, you will start to see patterns by over-analysing any situation.
I have no answers as to how Ms Foyce predicted her winning numbers, but it was most likely not through a magical black-box algorithm.
As many have said, she could have bought the tickets in bulk and posted the winning combinations after the results were announced.
There are many ways to achieve internet notoriety, you know?
Fun fact: Your money is better spent elsewhere. 'Wealth gained hastily will dwindle, but whoever gathers little by little will increase it.' (Proverbs 13:11)
Disclaimer: I am not a data scientist, nor am I trained in the field of machine learning. Several assumptions were made while training the models. If you are a data scientist, please feel free to chime in via the comments section.