Has filled their profile page
Kudos! You've been awarded a silver badge for this challenge. Keep up the great work!
Challenge: Snake Species Identification Challenge
Great work! You're one of the top participants in this challenge. Here's a gold badge to celebrate the achievement.
Challenge: Snake Species Identification Challenge
I used an environment.yml file and EfficientNet-PyTorch as a pip package.
Reference ( https://github.com/GokulEpiphany/contests-final-code/blob/master/aicrowd-snake-species/inference/environment.yml )
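For anyone reproducing this, a hedged illustration of the general shape of such an environment.yml (the names and pins below are my guesses, not the exact contents of the linked file):

```yaml
# Illustrative sketch only -- see the linked repo for the real file.
name: snakes
dependencies:
  - python=3.6
  - pytorch
  - torchvision
  - pip:
    - efficientnet-pytorch
```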
@akash18014 @picekl From what I remember, we have to explicitly set CUDA version 10.0. (Could I have an example of a working GitLab submission?) Maybe downgrade the CUDA toolkit from 10.1 to 10.0 and make sure the corresponding TensorFlow version works with 10.0.
This is the only link that helped me downgrade (https://dmitry.ai/t/topic/33)
What specific version of tensorflow are you using?
Could you please confirm the CUDA version of the K80? [Edit:] Found it in the logs, it's still 10.0. Thanks
In your aicrowd.json file, try adding "debug": true (lowercase, so the file stays valid JSON); you will then get logs explaining why the submissions are failing.
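A minimal sketch of what the file might look like with the flag enabled (the field names besides `debug` are illustrative, not taken from the starter kit):

```json
{
  "challenge_id": "aicrowd-snake-species-identification-challenge",
  "authors": ["aicrowd-user"],
  "debug": true,
  "gpu": true
}
```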
Could you please confirm this? A couple of CLEF challenges end at 23:59 UTC. BirdCLEF ends at 12:00 UTC.
@ValAn Congrats on the win
A relevant discussion.
conda env export --no-build > environment.yml
Also, inference happens on a K80 (if you enable GPU). Make sure the CUDA version is 10.0, not 10.1.
Could you please clarify the last date for the contest? The home page shows “2 days remaining”, but the Timeline mentions Jan 17, 2020.
Wrote a blog post:
Planning to put together a blog post /upload the code to GitHub and will link here.
It was a great experience participating in this challenge.
Evaluation has not started yet, then. Evaluations happen in sequence, and there are still two submissions ahead of yours. You can see it here
Mine got evaluated a while ago. What is the status of your submission? If the evaluation status is “Evaluation started/pending”, then wait a while, I guess.
@mohanty Evaluator is not picking up any of my submissions. Please look into it. Latest commit : https://gitlab.aicrowd.com/gokuleloop/snake-breed-identification/commit/c2d0dd9f55cd89796de929328b9a6e746941e774
You can use the following way to solve this issue:
verify_images(test_images_path, delete=True)  # test_images_path as in run.py
This will leave you with 17,686 files.
You will need some logic to generate predictions for the corrupted files.
I do it this way (not the best way):
- Use the sample submission file and replace the given random probabilities with predicted probabilities
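A rough sketch of that replacement step (function, column, and file names here are my own, not from the starter kit): rows with a prediction get overwritten, while corrupted files silently keep the sample submission's default probabilities.

```python
def fill_predictions(sample_rows, predicted, filename_col="filename"):
    """Replace the sample submission's placeholder probabilities with model
    predictions where available; corrupted files (no prediction) keep the
    sample's default probabilities."""
    out = []
    for row in sample_rows:
        pred = predicted.get(row[filename_col])
        # Merge predicted class columns over the placeholder ones.
        out.append({**row, **pred} if pred else dict(row))
    return out

# Tiny illustration with two classes and one corrupted image.
sample = [
    {"filename": "img_ok.jpg", "class-1": "0.5", "class-2": "0.5"},
    {"filename": "img_corrupt.jpg", "class-1": "0.5", "class-2": "0.5"},
]
preds = {"img_ok.jpg": {"class-1": "0.9", "class-2": "0.1"}}
filled = fill_predictions(sample, preds)
```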
Do test it locally. TTA takes a long time to complete (~7hrs)
Apologies, the evaluation took a very long time. I assumed it was stuck.
[Edit: This won’t work]
A suggestion (not sure if it will work):
Did you try cloning the repo and installing it before run.py is invoked in run.sh? Make sure git is installed as well.
Building torchvision from source (inside run.sh):
git clone https://github.com/pytorch/vision
cd vision
python setup.py install
Please provide logs for https://gitlab.aicrowd.com/gokuleloop/snake-breed-identification/issues/1 @arjun_nemani
Not sure if this is the problem, but did you try editing aicrowd.json and putting your ID in
"authors": ["aicrowd-user"]
Thanks for the reply.
What kind of f1 score is being used for this multiclass classification? f1 micro or macro or weighted?
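For context on why the question matters: micro and macro F1 differ in how per-class scores are averaged, and on an imbalanced species dataset they can diverge a lot. A self-contained sketch of both (this is not the official evaluator, just an illustration):

```python
from collections import Counter

def f1_scores(y_true, y_pred):
    """Compute micro- and macro-averaged F1 for multiclass labels."""
    classes = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = Counter(), Counter(), Counter()
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but wrong
            fn[t] += 1  # missed the true class t
    # Macro: unweighted mean of per-class F1 (rare species count equally).
    per_class = []
    for c in classes:
        denom = 2 * tp[c] + fp[c] + fn[c]
        per_class.append(2 * tp[c] / denom if denom else 0.0)
    macro = sum(per_class) / len(classes)
    # Micro: pool all decisions; for single-label multiclass this
    # collapses to plain accuracy.
    total_tp = sum(tp.values())
    micro = 2 * total_tp / (2 * total_tp + sum(fp.values()) + sum(fn.values()))
    return micro, macro

micro, macro = f1_scores([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0])
```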
Apologies for not getting back earlier. Was super busy and travelling.
Email ID is: email@example.com
I’m considering joining a couple of CVPR challenges, not sure. Will definitely let you know and we can possibly team up.
Hope you are doing great.
I have a couple of intern opportunities coming up. Could you please send across your/your friend's CV?
Any updates on this?
Sounds good to me. I can share the details by Sunday.
Sounds great. I’m currently in India (IST).
Would love to be a part of it. Please let me know what you need.
Please consider this as my final submission:
Feel free to go through my entire commits.
My attempts at reproducing results: https://gitlab.aicrowd.com/gokuleloop/snake-breed-identification/issues
My attempts to find the best scores: https://gitlab.aicrowd.com/gokuleloop/snake-species-identification-challenge/issues
Please consider submission-v15 as my final solution.
It has the entire pipeline (including preprocessing and species classification). With mere hours before the contest closes, I'm glad I was able to reproduce the results using the evaluator. +1 for reproducible science
Fixed it … Thanks
I’m getting a similar DataLoader error as well. Could you please share your comments?
DataLoader worker (pid 19) is killed by signal: Bus error.
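For what it's worth, this exact PyTorch error usually points at DataLoader worker processes running out of shared memory (/dev/shm) inside a container. Two hedged workarounds, assuming that is the cause here (I don't know the evaluator's container setup):

```
# 1. If you control the container launch, enlarge shared memory:
#    docker run --shm-size=2g <image>
#
# 2. Otherwise, avoid worker subprocesses entirely in the DataLoader:
#    DataLoader(dataset, batch_size=64, num_workers=0)
```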
Just went through the rules… Working on reproducing the results using the solver.
This is the issue,
I’m not able to tag you inside the issue. Only mohanty, aicrowd-bot, and I are available for @-mentions.
Is it crucial to write the CSV to the given location,
AICROWD_PREDICTIONS_OUTPUT_PATH = os.getenv('AICROWD_PREDICTIONS_OUTPUT_PATH', False)
or can we directly submit as,
predictions_output_path = 'predictions-1.csv'
"predictions_output_path": predictions_output_path
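A minimal sketch of honoring the env var with a local fallback (the row contents are purely illustrative); writing to the env-provided path is presumably what lets the evaluator locate the file, while the fallback only matters when running locally:

```python
import csv
import os

# The evaluator tells run.py where to write predictions via this env var;
# the hard-coded fallback is only for local runs outside the evaluator.
predictions_output_path = os.getenv("AICROWD_PREDICTIONS_OUTPUT_PATH",
                                    "predictions-1.csv")

rows = [("filename", "class-1"), ("img_0001.jpg", "0.9")]  # illustrative rows
with open(predictions_output_path, "w", newline="") as f:
    csv.writer(f).writerows(rows)
```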
Could you also share the log files for https://gitlab.aicrowd.com/gokuleloop/snake-species-identification-challenge/issues/5 and https://gitlab.aicrowd.com/gokuleloop/snake-species-identification-challenge/issues/4?