Public Evaluation: Completed. Final Evaluation: Completed.

Visual Doom AI Competition 2018 - Multiplayer Track (2)


ViZDoom Reinforcement Learning


UPDATE: Added results of the final evaluation.

Results

Out of 152 agents submitted by 33 teams, we selected the 3 best submissions from 3 different teams, according to the leaderboard, for a final evaluation on 10 new, unknown maps. We also added the two best bots from the 2017 edition and the best bot from the 2016 edition.

Winner: bwbell (Ben Bell)

1st Runner-Up: TSAILAB (Tsinghua University & Tencent AI Lab)

2nd Runner-Up: michaelkrax (Michael Krax)

Team         Bot                     Map:  1   2   3   4   5   6   7   8   9  10   Total frags
bwbell       Marv2in                      19  15  53  31  21  51  33  34  32  53   342
TSAILAB      AWM                          19  21  30  33  39  27  22  12  19  24   246
michaelkrax  CVFighter                    26  18  21  21  30  40  16  24   9  29   234
Terminators  Arnold4 (1st in 2017)        13  -4  16  15  15   9  16  24   8  17   128
TSAIL        YanShi (2nd in 2017)          6   8  14  18  11   8  13  12   6  21   117
IntelAct     IntelAct (1st in 2016)        2  -2  13  12  11  13  14   6   6  20    95
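The ranking of the three 2018 finalists follows directly from summing the per-map frags in the table above. A short sanity-check script (per-map numbers copied from the table; the dictionary layout is just for illustration):

```python
# Per-map frags for the three 2018 finalists, copied from the results table.
results = {
    "Marv2in":   [19, 15, 53, 31, 21, 51, 33, 34, 32, 53],
    "AWM":       [19, 21, 30, 33, 39, 27, 22, 12, 19, 24],
    "CVFighter": [26, 18, 21, 21, 30, 40, 16, 24, 9, 29],
}

# Rank bots by total frags across the 10 evaluation maps.
totals = {bot: sum(frags) for bot, frags in results.items()}
ranking = sorted(totals, key=totals.get, reverse=True)

print(totals)   # Marv2in: 342, AWM: 246, CVFighter: 234
print(ranking)  # ['Marv2in', 'AWM', 'CVFighter']
```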

Visual Doom AI Competition at CIG2018 Multiplayer Track (2)

This competition is run at the Computational Intelligence and Games (CIG) Conference 2018.

The task is to create an artificial intelligence agent that is able to compete with other agents in Doom deathmatches using only the data available to regular players, without any auxiliary information. The agent has to use the ViZDoom framework to connect to the game.

Track 2 is a full-on Doom deathmatch (as in previous years) on unknown maps. Agents will compete in multiplayer games, and the best frag collector will emerge victorious.

Singleplayer (track 1) challenge page is available here.

Evaluation criteria

The task is to create bots that fight each other in a regular deathmatch where different weapons and items are available. The bots will be ranked by their number of frags, which for this competition is defined as:

frags = number of killed opponents - number of suicides
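The scoring rule above can be made concrete with a small helper (the function name is illustrative, not part of the competition API):

```python
def frags(kills: int, suicides: int) -> int:
    """Competition score: number of killed opponents minus number of suicides."""
    return kills - suicides

# A bot with 20 kills and 3 suicides scores 17 frags.
# Frequent suicides can push a map score negative, as with the
# -4 and -2 entries in the final results table.
print(frags(20, 3))  # 17
print(frags(1, 3))   # -2
```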

Five maps are provided for training, and more maps can be found at Doomworld. The final evaluation will take place on several secret testing maps.

During the public evaluation phase, bots will play multiple matches against different opponents, and the leaderboard will be updated accordingly.

During the final 2 weeks the leaderboard will be hidden. Multiple matches on various maps will be played, similarly to the public evaluation period. The best bots (probably 8 of them) will take part in the final matches that determine the top ranking. Finalists will most likely be announced before the final results are published at CIG.

Resources

Repository with a sample (random) submission

https://github.com/crowdAI/vizdoom2018-multiplayer-starter-kit

Contact Us

We strongly encourage you to use the public channels mentioned above for communication between participants and organizers. In extreme cases, if you have queries or comments that you would like to make through a private channel, you can send us an email.

Prizes

UPDATE: The prizes have been updated as follows:

Top 3 places will be awarded 500 USD / 300 USD / 200 USD from the IEEE CIS, if eligible. For more information on the awarding policy, please consult this website.

Want to help or support us in any way? Contact us!

