2048 (3x3, 4x4, 5x5) AI
ASO Keyword Dashboard
Tracking 97 keywords for 2048 (3x3, 4x4, 5x5) AI in Apple App Store
2048 (3x3, 4x4, 5x5) AI tracks 97 keywords, none of which rank yet; all 97 still need traction. Key averages: opportunity 70.1, difficulty 38.5.
- Tracked keywords: 97 (0 ranked • 97 not ranking yet)
- Top 10 coverage: — (best rank: — • latest leader: —)
- Avg opportunity: 70.1 (top keyword: strategy)
- Avg difficulty: 38.5 (lower scores indicate easier wins)
Opportunity leaders
- strategy (68.5): Opportunity: 73.0 • Difficulty: 41.7 • Rank: — • Competitors: 236
- course (68.2): Opportunity: 73.0 • Difficulty: 41.1 • Rank: — • Competitors: 71
- random (67.4): Opportunity: 73.0 • Difficulty: 40.9 • Rank: — • Competitors: 144
- position (66.2): Opportunity: 73.0 • Difficulty: 40.0 • Rank: — • Competitors: 50
- advantage (64.8): Opportunity: 73.0 • Difficulty: 39.6 • Rank: — • Competitors: 117
Unranked opportunities
- strategy • Opportunity: 73.0 • Difficulty: 41.7 • Competitors: 236
- course • Opportunity: 73.0 • Difficulty: 41.1 • Competitors: 71
- random • Opportunity: 73.0 • Difficulty: 40.9 • Competitors: 144
- position • Opportunity: 73.0 • Difficulty: 40.0 • Competitors: 50
- advantage • Opportunity: 73.0 • Difficulty: 39.6 • Competitors: 117
High competition keywords
- like • Total apps: 142,680 • Major competitors: 2,089 • Latest rank: — • Difficulty: 52.5
- create • Total apps: 118,650 • Major competitors: 1,432 • Latest rank: — • Difficulty: 51.6
- best • Total apps: 116,953 • Major competitors: 1,658 • Latest rank: — • Difficulty: 51.5
- features • Total apps: 115,626 • Major competitors: 1,316 • Latest rank: — • Difficulty: 51.4
- based • Total apps: 80,023 • Major competitors: 681 • Latest rank: — • Difficulty: 49.6
All tracked keywords
Includes opportunity, difficulty, rankings and competitor benchmarks
| Keyword | Opportunity | Difficulty | Competition | Competing apps | Median installs | Avg rating | Latest rank | Best rank | Major competitors |
|---|---|---|---|---|---|---|---|---|---|
| best | 66 | 51 | 85 | 116,953 | 150 | 4.2 | — | — | 1,658 |
| puzzle | 70 | 45 | 75 | 29,343 | 175 | 4.3 | — | — | 552 |
| strategy | 73 | 42 | 69 | 12,444 | 250 | 4.2 | — | — | 236 |
| action | 72 | 43 | 71 | 17,905 | 150 | 4.2 | — | — | 308 |
| art | 71 | 44 | 73 | 23,826 | 125 | 4.2 | — | — | 253 |
| version | 70 | 45 | 75 | 29,505 | 150 | 4.0 | — | — | 226 |
| order | 68 | 49 | 80 | 60,070 | 75 | 4.3 | — | — | 523 |
| environment | 72 | 42 | 69 | 13,606 | 75 | 4.1 | — | — | 89 |
| classic | 70 | 45 | 74 | 25,527 | 200 | 4.3 | — | — | 532 |
| used | 68 | 48 | 79 | 50,800 | 75 | 4.0 | — | — | 303 |
| limited | 72 | 43 | 70 | 14,765 | 125 | 4.1 | — | — | 217 |
| created | 70 | 45 | 74 | 26,504 | 125 | 4.2 | — | — | 158 |
| multiple | 68 | 49 | 81 | 68,139 | 125 | 4.2 | — | — | 682 |
| including | 68 | 49 | 81 | 68,338 | 150 | 4.1 | — | — | 903 |
| score | 71 | 44 | 73 | 21,956 | 100 | 4.2 | — | — | 297 |
| various | 69 | 47 | 79 | 48,990 | 100 | 4.1 | — | — | 387 |
| tree | 71 | 35 | 57 | 2,707 | 150 | 4.2 | — | — | 27 |
| search | 68 | 49 | 80 | 56,712 | 125 | 4.1 | — | — | 630 |
| move | 70 | 45 | 74 | 26,889 | 125 | 4.1 | — | — | 326 |
| understanding | 72 | 42 | 69 | 14,163 | 75 | 4.3 | — | — | 49 |
| technique | 71 | 35 | 58 | 2,836 | 100 | 4.2 | — | — | 13 |
| run | 71 | 44 | 73 | 22,315 | 150 | 4.1 | — | — | 303 |
| create | 66 | 52 | 85 | 118,650 | 125 | 4.2 | — | — | 1,432 |
| good | 70 | 46 | 75 | 32,071 | 125 | 4.2 | — | — | 348 |
| ai | 69 | 47 | 78 | 44,935 | 125 | 4.3 | — | — | 390 |
App Description
Our 2048 is one of a kind in the market. We leverage multiple algorithms to create an AI for the classic 2048 puzzle game.
* Redefined by AI *
We created an AI that takes advantage of multiple state-of-the-art algorithms, including Monte Carlo Tree Search (MCTS) [a], Expectimax [b], Iterative Deepening Depth-First Search (IDDFS) [c] and Reinforcement Learning [d].
(a) Monte Carlo Tree Search (MCTS) is a heuristic search algorithm introduced in 2006 for computer Go, and it has since been used in other games such as chess and, of course, this 2048 game. MCTS chooses the best possible move by sampling and expanding the game tree from the current state (like IDDFS, it explores the game tree from the current position).
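The core of the Monte Carlo approach is scoring moves by random playouts. The following Python sketch shows the flat (tree-less) version of that idea; full MCTS adds a search tree and a selection rule such as UCT. The toy game and every function name here are our own assumptions for illustration, not the app's actual code:

```python
import random

# Toy game (an assumption for illustration): start from a running total,
# add 1, 2, or 3 per turn, and score 1 only by landing exactly on 10.
def legal_moves(state):
    return [1, 2, 3]

def apply_move(state, move):
    return state + move

def is_terminal(state):
    return state >= 10

def score(state):
    return 1.0 if state == 10 else 0.0

def rollout(state, rng):
    # Play uniformly random moves until the game ends, then return the score.
    while not is_terminal(state):
        state = apply_move(state, rng.choice(legal_moves(state)))
    return score(state)

def monte_carlo_move(state, n_rollouts=200, seed=0):
    # Score each legal move by the average result of random playouts,
    # then pick the move with the best average.
    rng = random.Random(seed)
    best_move, best_value = None, float("-inf")
    for move in legal_moves(state):
        value = sum(rollout(apply_move(state, move), rng)
                    for _ in range(n_rollouts)) / n_rollouts
        if value > best_value:
            best_move, best_value = move, value
    return best_move
```

From a total of 8, the sketch picks +2 (the only move that lands exactly on 10), which is the same averaging principle MCTS applies to 2048 board positions.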
(b) Expectimax search is a variation of the minimax algorithm with the addition of "chance" nodes in the search tree. This technique is commonly used in games with nondeterministic behavior, such as Minesweeper (random mine locations), Pac-Man (random ghost moves) and this 2048 game (random tile spawn position and value).
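A minimal expectimax sketch over an explicit game tree (our own illustrative structure, not the app's code): max nodes model the player's choice, while chance nodes, which stand in for random events like 2048's tile spawns, are valued by the probability-weighted average of their children.

```python
def expectimax(node):
    # Leaves hold static evaluation scores.
    if isinstance(node, (int, float)):
        return node
    kind, children = node
    if kind == "max":
        # Player node: choose the best branch.
        return max(expectimax(child) for child in children)
    if kind == "chance":
        # Chance node: expected value over (probability, child) pairs.
        return sum(p * expectimax(child) for p, child in children)
    raise ValueError(f"unknown node kind: {kind}")

# Example: two candidate moves, each followed by a random event.
tree = ("max", [
    ("chance", [(0.5, 3), (0.5, 5)]),    # expected value 4.0
    ("chance", [(0.9, 10), (0.1, 0)]),   # expected value 9.0
])
```

Here the player prefers the second move because its expected value (9.0) beats the first (4.0), even though that branch can also yield 0.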
(c) Iterative Deepening Depth-First Search (IDDFS) is a search strategy in which a depth-limited version of DFS is run repeatedly with increasing depth limits. IDDFS is optimal like breadth-first search (BFS), but uses much less memory. This 2048 AI implementation assigns heuristic scores (or penalties) to multiple board features (e.g. the empty-cell count) to compute the optimal next move.
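The iterative deepening loop itself is short; here is a generic Python sketch on a toy move graph (graph, names and depths are our assumptions, not the app's code). Each pass is a plain DFS capped at a depth limit, so only the current path is kept in memory, yet the first limit that succeeds is the shallowest solution depth, which is what makes IDDFS optimal like BFS.

```python
def depth_limited_search(graph, node, goal, limit):
    # DFS that refuses to descend past `limit` edges.
    if node == goal:
        return True
    if limit == 0:
        return False
    return any(depth_limited_search(graph, child, goal, limit - 1)
               for child in graph.get(node, []))

def iddfs(graph, start, goal, max_depth=20):
    # Re-run depth-limited DFS with limits 0, 1, 2, ... until the goal appears.
    for limit in range(max_depth + 1):
        if depth_limited_search(graph, start, goal, limit):
            return limit          # depth of the shallowest solution
    return None                   # not reachable within max_depth

# Toy move graph (an assumption for illustration).
graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "E": ["F"]}
```

For example, `iddfs(graph, "A", "F")` returns 3, the length of the shortest path A → C → E → F.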
(d) Reinforcement learning is the training of ML models to choose actions (or decisions) in an environment so as to maximize cumulative reward. This 2048 RL implementation has no hard-coded intelligence (i.e. no heuristic score based on human understanding of the game): there is no built-in knowledge of what makes a good move, and the AI agent "figures it out" on its own as we train the model.
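To show what "no hard-coded intelligence" means in practice, here is a tabular Q-learning sketch on a toy chain environment (the environment, hyperparameters and names are our assumptions; the app's actual RL setup is not public). Nothing tells the agent that moving right is good; the Q-values are learned purely from the reward received on reaching the last state.

```python
import random

def q_learning(n_states=5, episodes=500, alpha=0.5, gamma=0.9,
               epsilon=0.2, seed=0):
    # Chain environment: states 0..n_states-1, actions 0 = left, 1 = right.
    # Reward 1 only on entering the final state; 0 everywhere else.
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            # Epsilon-greedy: mostly exploit, occasionally explore.
            if rng.random() < epsilon:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s_next = max(0, s - 1) if a == 0 else s + 1
            reward = 1.0 if s_next == n_states - 1 else 0.0
            # Temporal-difference update toward reward + discounted future value.
            q[s][a] += alpha * (reward + gamma * max(q[s_next]) - q[s][a])
            s = s_next
    return q
```

After training, the greedy policy (pick the higher Q-value in each state) moves right everywhere, purely from experienced rewards; the same principle, scaled up with function approximation, is what lets an RL agent learn 2048 moves without hand-written heuristics.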
References:
[a] https://www.aaai.org/Papers/AIIDE/2008/AIIDE08-036.pdf
[b] http://www.jveness.info/publications/thesis.pdf
[c] https://cse.sc.edu/~MGV/csce580sp15/gradPres/korf_IDAStar_1985.pdf
[d] http://rail.eecs.berkeley.edu/deeprlcourse/static/slides/lec-8.pdf