Loading

AI Blitz #8

F1 Car Detection using Faster RCNN

Object Detection using faster_rcnn_R_101_C4_3x pretrained model

g_mothy

Getting Started Code for F1 Car Detection Challenge on AIcrowd

Download Necessary Packages 📚

In [ ]:
!pip install aicrowd-cli
In [ ]:
# Installing the AIcrowd CLI
!pip install aicrowd-cli

# Installing PyTorch
!pip install pyyaml==5.1
!pip install torch==1.7.1 torchvision==0.8.2
import torch, torchvision
print(torch.__version__, torch.cuda.is_available())
!gcc --version

# Installing Detectron2
import torch
assert torch.__version__.startswith("1.7")   
!pip install detectron2 -f https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.7/index.html
Collecting aicrowd-cli
  Downloading https://files.pythonhosted.org/packages/a5/8a/fca67e8c1cb1501a9653cd653232bf6fdebbb2393e3de861aad3636a1136/aicrowd_cli-0.1.6-py3-none-any.whl (51kB)
     |████████████████████████████████| 61kB 7.7MB/s 
Requirement already satisfied: click<8,>=7.1.2 in /usr/local/lib/python3.7/dist-packages (from aicrowd-cli) (7.1.2)
Collecting requests<3,>=2.25.1
  Downloading https://files.pythonhosted.org/packages/29/c1/24814557f1d22c56d50280771a17307e6bf87b70727d975fd6b2ce6b014a/requests-2.25.1-py2.py3-none-any.whl (61kB)
     |████████████████████████████████| 61kB 6.3MB/s 
Requirement already satisfied: toml<1,>=0.10.2 in /usr/local/lib/python3.7/dist-packages (from aicrowd-cli) (0.10.2)
Collecting tqdm<5,>=4.56.0
  Downloading https://files.pythonhosted.org/packages/72/8a/34efae5cf9924328a8f34eeb2fdaae14c011462d9f0e3fcded48e1266d1c/tqdm-4.60.0-py2.py3-none-any.whl (75kB)
     |████████████████████████████████| 81kB 9.4MB/s 
Collecting requests-toolbelt<1,>=0.9.1
  Downloading https://files.pythonhosted.org/packages/60/ef/7681134338fc097acef8d9b2f8abe0458e4d87559c689a8c306d0957ece5/requests_toolbelt-0.9.1-py2.py3-none-any.whl (54kB)
     |████████████████████████████████| 61kB 8.6MB/s 
Collecting rich<11,>=10.0.0
  Downloading https://files.pythonhosted.org/packages/1a/da/2a1f064dc620ab47f3f826ae085384084b71ea05c8c21d67f1dfc29189ab/rich-10.1.0-py3-none-any.whl (201kB)
     |████████████████████████████████| 204kB 50.4MB/s 
Collecting gitpython<4,>=3.1.12
  Downloading https://files.pythonhosted.org/packages/a6/99/98019716955ba243657daedd1de8f3a88ca1f5b75057c38e959db22fb87b/GitPython-3.1.14-py3-none-any.whl (159kB)
     |████████████████████████████████| 163kB 50.6MB/s 
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.25.1->aicrowd-cli) (1.24.3)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.25.1->aicrowd-cli) (2.10)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.25.1->aicrowd-cli) (3.0.4)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.25.1->aicrowd-cli) (2020.12.5)
Requirement already satisfied: pygments<3.0.0,>=2.6.0 in /usr/local/lib/python3.7/dist-packages (from rich<11,>=10.0.0->aicrowd-cli) (2.6.1)
Requirement already satisfied: typing-extensions<4.0.0,>=3.7.4 in /usr/local/lib/python3.7/dist-packages (from rich<11,>=10.0.0->aicrowd-cli) (3.7.4.3)
Collecting commonmark<0.10.0,>=0.9.0
  Downloading https://files.pythonhosted.org/packages/b1/92/dfd892312d822f36c55366118b95d914e5f16de11044a27cf10a7d71bbbf/commonmark-0.9.1-py2.py3-none-any.whl (51kB)
     |████████████████████████████████| 51kB 7.1MB/s 
Collecting colorama<0.5.0,>=0.4.0
  Downloading https://files.pythonhosted.org/packages/44/98/5b86278fbbf250d239ae0ecb724f8572af1c91f4a11edf4d36a206189440/colorama-0.4.4-py2.py3-none-any.whl
Collecting gitdb<5,>=4.0.1
  Downloading https://files.pythonhosted.org/packages/ea/e8/f414d1a4f0bbc668ed441f74f44c116d9816833a48bf81d22b697090dba8/gitdb-4.0.7-py3-none-any.whl (63kB)
     |████████████████████████████████| 71kB 9.4MB/s 
Collecting smmap<5,>=3.0.1
  Downloading https://files.pythonhosted.org/packages/68/ee/d540eb5e5996eb81c26ceffac6ee49041d473bc5125f2aa995cf51ec1cf1/smmap-4.0.0-py2.py3-none-any.whl
ERROR: google-colab 1.0.0 has requirement requests~=2.23.0, but you'll have requests 2.25.1 which is incompatible.
ERROR: datascience 0.10.6 has requirement folium==0.2.1, but you'll have folium 0.8.3 which is incompatible.
Installing collected packages: requests, tqdm, requests-toolbelt, commonmark, colorama, rich, smmap, gitdb, gitpython, aicrowd-cli
  Found existing installation: requests 2.23.0
    Uninstalling requests-2.23.0:
      Successfully uninstalled requests-2.23.0
  Found existing installation: tqdm 4.41.1
    Uninstalling tqdm-4.41.1:
      Successfully uninstalled tqdm-4.41.1
Successfully installed aicrowd-cli-0.1.6 colorama-0.4.4 commonmark-0.9.1 gitdb-4.0.7 gitpython-3.1.14 requests-2.25.1 requests-toolbelt-0.9.1 rich-10.1.0 smmap-4.0.0 tqdm-4.60.0
Collecting pyyaml==5.1
  Downloading https://files.pythonhosted.org/packages/9f/2c/9417b5c774792634834e730932745bc09a7d36754ca00acf1ccd1ac2594d/PyYAML-5.1.tar.gz (274kB)
     |████████████████████████████████| 276kB 18.2MB/s 
Building wheels for collected packages: pyyaml
  Building wheel for pyyaml (setup.py) ... done
  Created wheel for pyyaml: filename=PyYAML-5.1-cp37-cp37m-linux_x86_64.whl size=44074 sha256=e4875cc88aa636d9d0142e96c799885e62114afcb54a40785239ca05f25b0e32
  Stored in directory: /root/.cache/pip/wheels/ad/56/bc/1522f864feb2a358ea6f1a92b4798d69ac783a28e80567a18b
Successfully built pyyaml
Installing collected packages: pyyaml
  Found existing installation: PyYAML 3.13
    Uninstalling PyYAML-3.13:
      Successfully uninstalled PyYAML-3.13
Successfully installed pyyaml-5.1
Collecting torch==1.7.1
  Downloading https://files.pythonhosted.org/packages/90/5d/095ddddc91c8a769a68c791c019c5793f9c4456a688ddd235d6670924ecb/torch-1.7.1-cp37-cp37m-manylinux1_x86_64.whl (776.8MB)
     |████████████████████████████████| 776.8MB 23kB/s 
Collecting torchvision==0.8.2
  Downloading https://files.pythonhosted.org/packages/94/df/969e69a94cff1c8911acb0688117f95e1915becc1e01c73e7960a2c76ec8/torchvision-0.8.2-cp37-cp37m-manylinux1_x86_64.whl (12.8MB)
     |████████████████████████████████| 12.8MB 240kB/s 
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.7/dist-packages (from torch==1.7.1) (3.7.4.3)
Requirement already satisfied: numpy in /usr/local/lib/python3.7/dist-packages (from torch==1.7.1) (1.19.5)
Requirement already satisfied: pillow>=4.1.1 in /usr/local/lib/python3.7/dist-packages (from torchvision==0.8.2) (7.1.2)
ERROR: torchtext 0.9.1 has requirement torch==1.8.1, but you'll have torch 1.7.1 which is incompatible.
Installing collected packages: torch, torchvision
  Found existing installation: torch 1.8.1+cu101
    Uninstalling torch-1.8.1+cu101:
      Successfully uninstalled torch-1.8.1+cu101
  Found existing installation: torchvision 0.9.1+cu101
    Uninstalling torchvision-0.9.1+cu101:
      Successfully uninstalled torchvision-0.9.1+cu101
Successfully installed torch-1.7.1 torchvision-0.8.2
1.7.1 True
gcc (Ubuntu 7.5.0-3ubuntu1~18.04) 7.5.0
Copyright (C) 2017 Free Software Foundation, Inc.
This is free software; see the source for copying conditions.  There is NO
warranty; not even for MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.

Looking in links: https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.7/index.html
Collecting detectron2
  Downloading https://dl.fbaipublicfiles.com/detectron2/wheels/cu101/torch1.7/detectron2-0.4%2Bcu101-cp37-cp37m-linux_x86_64.whl (6.0MB)
     |████████████████████████████████| 6.0MB 621kB/s 
Requirement already satisfied: cloudpickle in /usr/local/lib/python3.7/dist-packages (from detectron2) (1.3.0)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.7/dist-packages (from detectron2) (3.2.2)
Requirement already satisfied: tensorboard in /usr/local/lib/python3.7/dist-packages (from detectron2) (2.4.1)
Collecting omegaconf>=2
  Downloading https://files.pythonhosted.org/packages/d0/eb/9d63ce09dd8aa85767c65668d5414958ea29648a0eec80a4a7d311ec2684/omegaconf-2.0.6-py3-none-any.whl
Collecting iopath>=0.1.2
  Downloading https://files.pythonhosted.org/packages/21/d0/22104caed16fa41382702fed959f4a9b088b2f905e7a82e4483180a2ec2a/iopath-0.1.8-py3-none-any.whl
Requirement already satisfied: Pillow>=7.1 in /usr/local/lib/python3.7/dist-packages (from detectron2) (7.1.2)
Requirement already satisfied: pydot in /usr/local/lib/python3.7/dist-packages (from detectron2) (1.3.0)
Collecting fvcore<0.1.4,>=0.1.3
  Downloading https://files.pythonhosted.org/packages/6b/68/2bacb80e13c4084dfc37fec8f17706a1de4c248157561ff33e463399c4f5/fvcore-0.1.3.post20210317.tar.gz (47kB)
     |████████████████████████████████| 51kB 6.7MB/s 
Requirement already satisfied: tqdm>4.29.0 in /usr/local/lib/python3.7/dist-packages (from detectron2) (4.60.0)
Requirement already satisfied: future in /usr/local/lib/python3.7/dist-packages (from detectron2) (0.16.0)
Collecting yacs>=0.1.6
  Downloading https://files.pythonhosted.org/packages/38/4f/fe9a4d472aa867878ce3bb7efb16654c5d63672b86dc0e6e953a67018433/yacs-0.1.8-py3-none-any.whl
Requirement already satisfied: tabulate in /usr/local/lib/python3.7/dist-packages (from detectron2) (0.8.9)
Requirement already satisfied: pycocotools>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from detectron2) (2.0.2)
Requirement already satisfied: termcolor>=1.1 in /usr/local/lib/python3.7/dist-packages (from detectron2) (1.1.0)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.7/dist-packages (from matplotlib->detectron2) (0.10.0)
Requirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib->detectron2) (2.4.7)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib->detectron2) (1.3.1)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib->detectron2) (2.8.1)
Requirement already satisfied: numpy>=1.11 in /usr/local/lib/python3.7/dist-packages (from matplotlib->detectron2) (1.19.5)
Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (1.8.0)
Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (1.0.1)
Requirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (56.1.0)
Requirement already satisfied: protobuf>=3.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (3.12.4)
Requirement already satisfied: wheel>=0.26; python_version >= "3" in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (0.36.2)
Requirement already satisfied: six>=1.10.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (1.15.0)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (3.3.4)
Requirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (0.4.4)
Requirement already satisfied: grpcio>=1.24.3 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (1.32.0)
Requirement already satisfied: google-auth<2,>=1.6.3 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (1.28.1)
Requirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (2.25.1)
Requirement already satisfied: absl-py>=0.4 in /usr/local/lib/python3.7/dist-packages (from tensorboard->detectron2) (0.12.0)
Requirement already satisfied: PyYAML>=5.1.* in /usr/local/lib/python3.7/dist-packages (from omegaconf>=2->detectron2) (5.1)
Requirement already satisfied: typing-extensions in /usr/local/lib/python3.7/dist-packages (from omegaconf>=2->detectron2) (3.7.4.3)
Collecting portalocker
  Downloading https://files.pythonhosted.org/packages/68/33/cb524f4de298509927b90aa5ee34767b9a2b93e663cf354b2a3efa2b4acd/portalocker-2.3.0-py2.py3-none-any.whl
Requirement already satisfied: cython>=0.27.3 in /usr/local/lib/python3.7/dist-packages (from pycocotools>=2.0.2->detectron2) (0.29.22)
Requirement already satisfied: importlib-metadata; python_version < "3.8" in /usr/local/lib/python3.7/dist-packages (from markdown>=2.6.8->tensorboard->detectron2) (3.10.1)
Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.7/dist-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard->detectron2) (1.3.0)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2) (4.2.1)
Requirement already satisfied: rsa<5,>=3.1.4; python_version >= "3.6" in /usr/local/lib/python3.7/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2) (4.7.2)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from google-auth<2,>=1.6.3->tensorboard->detectron2) (0.2.8)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2) (1.24.3)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2) (2020.12.5)
Requirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2) (3.0.4)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3,>=2.21.0->tensorboard->detectron2) (2.10)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata; python_version < "3.8"->markdown>=2.6.8->tensorboard->detectron2) (3.4.1)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard->detectron2) (3.1.0)
Requirement already satisfied: pyasn1>=0.1.3 in /usr/local/lib/python3.7/dist-packages (from rsa<5,>=3.1.4; python_version >= "3.6"->google-auth<2,>=1.6.3->tensorboard->detectron2) (0.4.8)
Building wheels for collected packages: fvcore
  Building wheel for fvcore (setup.py) ... done
  Created wheel for fvcore: filename=fvcore-0.1.3.post20210317-cp37-none-any.whl size=58543 sha256=29ac8d7afe96f903bf66c185a33d3ce3c6c6fe6b2ecd15999f4d44055cd59cff
  Stored in directory: /root/.cache/pip/wheels/d2/ee/3a/5c531df777c03d8c67f22c65f97d6f75321087482d05a9b218
Successfully built fvcore
Installing collected packages: omegaconf, portalocker, iopath, yacs, fvcore, detectron2
Successfully installed detectron2-0.4+cu101 fvcore-0.1.3.post20210317 iopath-0.1.8 omegaconf-2.0.6 portalocker-2.3.0 yacs-0.1.8

Download Data ⏬

The first step is to download out train test data. We will be training a model on the train data and make predictions on test data. We submit our predictions.

In [ ]:
API_KEY = "" #Please enter your API Key from [https://www.aicrowd.com/participants/me]
!aicrowd login --api-key $API_KEY
API Key valid
Saved API Key successfully!
In [ ]:
!aicrowd dataset download --challenge f1-car-detection -j 3
sample_submission.csv:   0% 0.00/228k [00:00<?, ?B/s]
sample_submission.csv: 100% 228k/228k [00:00<00:00, 1.04MB/s]
train.csv:   0% 0.00/547k [00:00<?, ?B/s]

train.csv: 100% 547k/547k [00:00<00:00, 1.70MB/s]
val.csv: 100% 52.6k/52.6k [00:00<00:00, 775kB/s]
val.zip:   0% 0.00/13.1M [00:00<?, ?B/s]

train.zip:  26% 33.6M/131M [00:01<00:03, 27.3MB/s]
test.zip: 100% 32.5M/32.5M [00:01<00:00, 18.7MB/s]
val.zip: 100% 13.1M/13.1M [00:00<00:00, 17.7MB/s]


train.zip:  51% 67.1M/131M [00:01<00:01, 35.1MB/s]

train.zip:  77% 101M/131M [00:02<00:00, 37.9MB/s] 

train.zip: 100% 131M/131M [00:03<00:00, 37.4MB/s]

Below, we create a new directory to put our downloaded data! 🏎

We unzip the ZIP files and move the CSVs.

In [ ]:
!rm -rf data
!mkdir data

!unzip train.zip -d data/train > /dev/null

!unzip val.zip -d data/val > /dev/null

!unzip test.zip -d data/test > /dev/null

!mv train.csv data/train.csv
!mv val.csv data/val.csv

Import packages 📦

It's time to import all the packages that we have downloaded and the packages that we will be needing for building our model.

In [ ]:
import torch, torchvision
import detectron2
from detectron2.utils.logger import setup_logger
setup_logger()

from detectron2 import model_zoo
from detectron2.engine import DefaultTrainer, DefaultPredictor
from detectron2.config import get_cfg
from detectron2.utils.visualizer import Visualizer
from detectron2.data import MetadataCatalog, DatasetCatalog
from detectron2.structures import BoxMode

import pandas as pd
import numpy as np
import os
from PIL import Image, ImageDraw
import matplotlib.pyplot as plt
from tqdm.notebook import tqdm
import cv2
import random
from ast import literal_eval

Load Data

  • We use pandas 🐼 library to load our data.
  • Pandas loads the data into dataframes and facilitates us to analyse the data.
  • Learn more about it here 🤓
In [ ]:
data_path = "data"

train_df = pd.read_csv(os.path.join(data_path, "train.csv"))
val_df = pd.read_csv(os.path.join(data_path, "val.csv"))

Visualize the data 👀

Using Pandas and the Matplot Library in Python, we will be viewing the images in our datasets.

In [ ]:
train_df
Out[ ]:
ImageID bboxes
0 0 [85, 174, 87, 161]
1 1 [72, 165, 72, 169]
2 2 [36, 215, 63, 189]
3 3 [52, 202, 69, 207]
4 4 [83, 146, 53, 157]
... ... ...
19995 19995 [113, 193, 77, 188]
19996 19996 [46, 211, 115, 163]
19997 19997 [45, 225, 65, 172]
19998 19998 [52, 149, 59, 153]
19999 19999 [75, 189, 86, 153]

20000 rows × 2 columns

In [ ]:
# Defining a function to take a look at the images
def show_images(images, num = 5):
    
    images_to_show = np.random.choice(images, num)

    for image_id in images_to_show:

        image = Image.open(os.path.join(data_path, f"train/{image_id}.jpg"))
  
        bbox = literal_eval(train_df.loc[train_df['ImageID'] == image_id]['bboxes'].values[0])

        draw = ImageDraw.Draw(image)

        draw.rectangle([bbox[0], bbox[2], bbox[1], bbox[3]], width=1)

        plt.figure(figsize = (15,15))
        plt.imshow(image)
        plt.show()

show_images(train_df['ImageID'].unique(), num = 5)

Creating Dataset 🎈

In the section below, we will be creating the dataset that will be put into our Model for training!

In [ ]:
dict_dataset = []
def get_dataset_dics():

    for index, row in train_df.iterrows():

        image = Image.open(os.path.join(data_path, f"train/{row['ImageID']}.jpg"))
        w, h = image.size
        
        ann_lst = []

        bbox = literal_eval(row['bboxes'])
    
        ann_dict = {'bbox': [bbox[0], bbox[2], bbox[1], bbox[3]],
        'bbox_mode': BoxMode.XYXY_ABS,
        'category_id': 0, #i[1]['category_id'].values[0],
        'iscrowd': 0}
        
        ann_lst.append(ann_dict)

        image_dict = {'annotations': ann_lst,
            'file_name': os.path.join(data_path, f"train/{row['ImageID']}.jpg"),
            'height': h,
            'image_id': row["ImageID"], #i[1]['image_category_id'].values[0],
            'width': w}
          
        dict_dataset.append(image_dict)

    return dict_dataset

dict_dataset = get_dataset_dics()
In [ ]:
d = f"f1_train{np.random.randint(10000)}"
DatasetCatalog.register(d, lambda d=d : get_dataset_dics())
MetadataCatalog.get(d).set(thing_classes=["F1Cars"])
obj_metadata = MetadataCatalog.get(d)
In [ ]:
for i in random.sample(dict_dataset, 3):
    img = cv2.imread(i["file_name"])
    visualizer = Visualizer(img, metadata=obj_metadata, scale=0.5)
    out = visualizer.draw_dataset_dict(i)
    plt.imshow(out.get_image())

Creating the Model 🏎

Now that we have the dataset is ready, it's time to create a model that we will train on our data!

In [ ]:
cfg = get_cfg()
cfg.merge_from_file(model_zoo.get_config_file("COCO-Detection/faster_rcnn_R_101_C4_3x.yaml"))
cfg.DATASETS.TRAIN = (d,)
cfg.DATASETS.TEST = ()
cfg.DATALOADER.NUM_WORKERS = 2
cfg.MODEL.WEIGHTS = model_zoo.get_checkpoint_url("COCO-Detection/faster_rcnn_R_101_C4_3x.yaml")  #faster_rcnn_R_50_DC5_3x
cfg.SOLVER.IMS_PER_BATCH = 2
cfg.SOLVER.BASE_LR = 0.00025
cfg.SOLVER.MAX_ITER = 4000 #2500
cfg.MODEL.ROI_HEADS.BATCH_SIZE_PER_IMAGE = 64  
cfg.MODEL.ROI_HEADS.NUM_CLASSES = 1

os.makedirs(cfg.OUTPUT_DIR, exist_ok=True)
In [ ]:
#!ls /usr/local/lib/python3.7/dist-packages/detectron2/model_zoo/configs/COCO-Detection/

Train the Model 🏃🏽‍♂️

In [ ]:
trainer = DefaultTrainer(cfg) 
trainer.resume_or_load(resume=False)
trainer.train()
[05/09 18:16:48 d2.engine.defaults]: Model:
GeneralizedRCNN(
  (backbone): ResNet(
    (stem): BasicStem(
      (conv1): Conv2d(
        3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3), bias=False
        (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
      )
    )
    (res2): Sequential(
      (0): BottleneckBlock(
        (shortcut): Conv2d(
          64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv1): Conv2d(
          64, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv2): Conv2d(
          64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv3): Conv2d(
          64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
      )
      (1): BottleneckBlock(
        (conv1): Conv2d(
          256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv2): Conv2d(
          64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv3): Conv2d(
          64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
      )
      (2): BottleneckBlock(
        (conv1): Conv2d(
          256, 64, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv2): Conv2d(
          64, 64, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=64, eps=1e-05)
        )
        (conv3): Conv2d(
          64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
      )
    )
    (res3): Sequential(
      (0): BottleneckBlock(
        (shortcut): Conv2d(
          256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv1): Conv2d(
          256, 128, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv2): Conv2d(
          128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv3): Conv2d(
          128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
      )
      (1): BottleneckBlock(
        (conv1): Conv2d(
          512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv2): Conv2d(
          128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv3): Conv2d(
          128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
      )
      (2): BottleneckBlock(
        (conv1): Conv2d(
          512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv2): Conv2d(
          128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv3): Conv2d(
          128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
      )
      (3): BottleneckBlock(
        (conv1): Conv2d(
          512, 128, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv2): Conv2d(
          128, 128, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=128, eps=1e-05)
        )
        (conv3): Conv2d(
          128, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
      )
    )
    (res4): Sequential(
      (0): BottleneckBlock(
        (shortcut): Conv2d(
          512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
        (conv1): Conv2d(
          512, 256, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (1): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (2): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (3): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (4): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (5): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (6): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (7): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (8): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (9): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (10): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (11): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (12): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (13): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (14): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (15): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (16): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (17): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (18): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (19): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (20): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (21): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
      (22): BottleneckBlock(
        (conv1): Conv2d(
          1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv2): Conv2d(
          256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=256, eps=1e-05)
        )
        (conv3): Conv2d(
          256, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=1024, eps=1e-05)
        )
      )
    )
  )
  (proposal_generator): RPN(
    (rpn_head): StandardRPNHead(
      (conv): Conv2d(1024, 1024, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1))
      (objectness_logits): Conv2d(1024, 15, kernel_size=(1, 1), stride=(1, 1))
      (anchor_deltas): Conv2d(1024, 60, kernel_size=(1, 1), stride=(1, 1))
    )
    (anchor_generator): DefaultAnchorGenerator(
      (cell_anchors): BufferList()
    )
  )
  (roi_heads): Res5ROIHeads(
    (pooler): ROIPooler(
      (level_poolers): ModuleList(
        (0): ROIAlign(output_size=(14, 14), spatial_scale=0.0625, sampling_ratio=0, aligned=True)
      )
    )
    (res5): Sequential(
      (0): BottleneckBlock(
        (shortcut): Conv2d(
          1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
        )
        (conv1): Conv2d(
          1024, 512, kernel_size=(1, 1), stride=(2, 2), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv2): Conv2d(
          512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv3): Conv2d(
          512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
        )
      )
      (1): BottleneckBlock(
        (conv1): Conv2d(
          2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv2): Conv2d(
          512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv3): Conv2d(
          512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
        )
      )
      (2): BottleneckBlock(
        (conv1): Conv2d(
          2048, 512, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv2): Conv2d(
          512, 512, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=512, eps=1e-05)
        )
        (conv3): Conv2d(
          512, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False
          (norm): FrozenBatchNorm2d(num_features=2048, eps=1e-05)
        )
      )
    )
    (box_predictor): FastRCNNOutputLayers(
      (cls_score): Linear(in_features=2048, out_features=2, bias=True)
      (bbox_pred): Linear(in_features=2048, out_features=4, bias=True)
    )
  )
)
[05/09 18:16:53 d2.data.build]: Removed 0 images with no usable annotations. 60000 images left.
[05/09 18:16:56 d2.data.build]: Distribution of instances among all 1 categories:
|  category  | #instances   |
|:----------:|:-------------|
|   F1Cars   | 60000        |
|            |              |
[05/09 18:16:56 d2.data.dataset_mapper]: [DatasetMapper] Augmentations used in training: [ResizeShortestEdge(short_edge_length=(640, 672, 704, 736, 768, 800), max_size=1333, sample_style='choice'), RandomFlip()]
[05/09 18:16:56 d2.data.build]: Using training sampler TrainingSampler
[05/09 18:16:56 d2.data.common]: Serializing 60000 elements to byte tensors and concatenating them all ...
[05/09 18:16:56 d2.data.common]: Serialized dataset takes 12.33 MiB
WARNING [05/09 18:16:56 d2.solver.build]: SOLVER.STEPS contains values larger than SOLVER.MAX_ITER. These values will be ignored.
Skip loading parameter 'roi_heads.box_predictor.cls_score.weight' to the model due to incompatible shapes: (81, 2048) in the checkpoint but (2, 2048) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.cls_score.bias' to the model due to incompatible shapes: (81,) in the checkpoint but (2,) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.weight' to the model due to incompatible shapes: (320, 2048) in the checkpoint but (4, 2048) in the model! You might want to double check if this is expected.
Skip loading parameter 'roi_heads.box_predictor.bbox_pred.bias' to the model due to incompatible shapes: (320,) in the checkpoint but (4,) in the model! You might want to double check if this is expected.
[05/09 18:16:56 d2.engine.train_loop]: Starting training from iteration 0
[05/09 18:17:06 d2.utils.events]:  eta: 0:30:28  iter: 19  total_loss: 1.639  loss_cls: 0.6138  loss_box_reg: 0.99  loss_rpn_cls: 0.008319  loss_rpn_loc: 0.02781  time: 0.4683  data_time: 0.0180  lr: 4.9953e-06  max_mem: 2833M
[05/09 18:17:16 d2.utils.events]:  eta: 0:30:35  iter: 39  total_loss: 1.609  loss_cls: 0.5801  loss_box_reg: 1.006  loss_rpn_cls: 0.008355  loss_rpn_loc: 0.02146  time: 0.4753  data_time: 0.0053  lr: 9.9902e-06  max_mem: 2833M
[05/09 18:17:25 d2.utils.events]:  eta: 0:30:29  iter: 59  total_loss: 1.581  loss_cls: 0.511  loss_box_reg: 1.007  loss_rpn_cls: 0.008262  loss_rpn_loc: 0.02544  time: 0.4743  data_time: 0.0046  lr: 1.4985e-05  max_mem: 2833M
[05/09 18:17:35 d2.utils.events]:  eta: 0:30:31  iter: 79  total_loss: 1.471  loss_cls: 0.4393  loss_box_reg: 0.9951  loss_rpn_cls: 0.008001  loss_rpn_loc: 0.02152  time: 0.4754  data_time: 0.0050  lr: 1.998e-05  max_mem: 2833M
[05/09 18:17:45 d2.utils.events]:  eta: 0:30:37  iter: 99  total_loss: 1.387  loss_cls: 0.3729  loss_box_reg: 0.956  loss_rpn_cls: 0.008018  loss_rpn_loc: 0.03548  time: 0.4788  data_time: 0.0061  lr: 2.4975e-05  max_mem: 2833M
[05/09 18:17:54 d2.utils.events]:  eta: 0:30:28  iter: 119  total_loss: 1.307  loss_cls: 0.3018  loss_box_reg: 0.9669  loss_rpn_cls: 0.004709  loss_rpn_loc: 0.006505  time: 0.4764  data_time: 0.0055  lr: 2.997e-05  max_mem: 2833M
[05/09 18:18:04 d2.utils.events]:  eta: 0:30:20  iter: 139  total_loss: 1.22  loss_cls: 0.2485  loss_box_reg: 0.9409  loss_rpn_cls: 0.008002  loss_rpn_loc: 0.01882  time: 0.4763  data_time: 0.0057  lr: 3.4965e-05  max_mem: 2833M
[05/09 18:18:13 d2.utils.events]:  eta: 0:30:12  iter: 159  total_loss: 1.114  loss_cls: 0.1901  loss_box_reg: 0.9074  loss_rpn_cls: 0.005702  loss_rpn_loc: 0.02046  time: 0.4769  data_time: 0.0050  lr: 3.996e-05  max_mem: 2833M
[05/09 18:18:23 d2.utils.events]:  eta: 0:30:05  iter: 179  total_loss: 1.065  loss_cls: 0.146  loss_box_reg: 0.8866  loss_rpn_cls: 0.004049  loss_rpn_loc: 0.01627  time: 0.4793  data_time: 0.0050  lr: 4.4955e-05  max_mem: 2833M
[05/09 18:18:33 d2.utils.events]:  eta: 0:30:05  iter: 199  total_loss: 1.036  loss_cls: 0.1273  loss_box_reg: 0.8779  loss_rpn_cls: 0.004466  loss_rpn_loc: 0.02314  time: 0.4811  data_time: 0.0051  lr: 4.995e-05  max_mem: 2833M
[05/09 18:18:43 d2.utils.events]:  eta: 0:30:00  iter: 219  total_loss: 0.9286  loss_cls: 0.08718  loss_box_reg: 0.8196  loss_rpn_cls: 0.001998  loss_rpn_loc: 0.02214  time: 0.4817  data_time: 0.0049  lr: 5.4945e-05  max_mem: 2833M
[05/09 18:18:53 d2.utils.events]:  eta: 0:29:52  iter: 239  total_loss: 0.9104  loss_cls: 0.0768  loss_box_reg: 0.7902  loss_rpn_cls: 0.002318  loss_rpn_loc: 0.02265  time: 0.4814  data_time: 0.0047  lr: 5.994e-05  max_mem: 2833M
[05/09 18:19:03 d2.utils.events]:  eta: 0:29:53  iter: 259  total_loss: 0.8356  loss_cls: 0.07046  loss_box_reg: 0.7268  loss_rpn_cls: 0.002184  loss_rpn_loc: 0.019  time: 0.4835  data_time: 0.0050  lr: 6.4935e-05  max_mem: 2833M
[05/09 18:19:13 d2.utils.events]:  eta: 0:29:51  iter: 279  total_loss: 0.7576  loss_cls: 0.06312  loss_box_reg: 0.6608  loss_rpn_cls: 0.001374  loss_rpn_loc: 0.01606  time: 0.4847  data_time: 0.0060  lr: 6.993e-05  max_mem: 2833M
[05/09 18:19:23 d2.utils.events]:  eta: 0:29:45  iter: 299  total_loss: 0.6438  loss_cls: 0.04504  loss_box_reg: 0.5698  loss_rpn_cls: 0.001202  loss_rpn_loc: 0.02114  time: 0.4867  data_time: 0.0049  lr: 7.4925e-05  max_mem: 2833M
[05/09 18:19:33 d2.utils.events]:  eta: 0:29:43  iter: 319  total_loss: 0.4723  loss_cls: 0.02826  loss_box_reg: 0.4077  loss_rpn_cls: 0.001485  loss_rpn_loc: 0.02326  time: 0.4886  data_time: 0.0052  lr: 7.992e-05  max_mem: 2833M
[05/09 18:19:44 d2.utils.events]:  eta: 0:29:38  iter: 339  total_loss: 0.3759  loss_cls: 0.03441  loss_box_reg: 0.3178  loss_rpn_cls: 0.001615  loss_rpn_loc: 0.02019  time: 0.4898  data_time: 0.0056  lr: 8.4915e-05  max_mem: 2833M
[05/09 18:19:54 d2.utils.events]:  eta: 0:29:29  iter: 359  total_loss: 0.3098  loss_cls: 0.03275  loss_box_reg: 0.2432  loss_rpn_cls: 0.001584  loss_rpn_loc: 0.02476  time: 0.4902  data_time: 0.0053  lr: 8.991e-05  max_mem: 2833M
[05/09 18:20:03 d2.utils.events]:  eta: 0:29:19  iter: 379  total_loss: 0.2541  loss_cls: 0.03289  loss_box_reg: 0.208  loss_rpn_cls: 0.0009749  loss_rpn_loc: 0.01088  time: 0.4905  data_time: 0.0047  lr: 9.4905e-05  max_mem: 2833M
[05/09 18:20:13 d2.utils.events]:  eta: 0:29:14  iter: 399  total_loss: 0.2271  loss_cls: 0.02949  loss_box_reg: 0.1866  loss_rpn_cls: 0.0005846  loss_rpn_loc: 0.01184  time: 0.4902  data_time: 0.0049  lr: 9.99e-05  max_mem: 2833M
[05/09 18:20:23 d2.utils.events]:  eta: 0:29:02  iter: 419  total_loss: 0.2177  loss_cls: 0.02969  loss_box_reg: 0.166  loss_rpn_cls: 0.001499  loss_rpn_loc: 0.0107  time: 0.4903  data_time: 0.0050  lr: 0.0001049  max_mem: 2833M
[05/09 18:20:33 d2.utils.events]:  eta: 0:28:56  iter: 439  total_loss: 0.2252  loss_cls: 0.02815  loss_box_reg: 0.1602  loss_rpn_cls: 0.0006583  loss_rpn_loc: 0.01962  time: 0.4914  data_time: 0.0052  lr: 0.00010989  max_mem: 2833M
[05/09 18:20:43 d2.utils.events]:  eta: 0:28:46  iter: 459  total_loss: 0.2029  loss_cls: 0.02449  loss_box_reg: 0.1404  loss_rpn_cls: 0.000899  loss_rpn_loc: 0.02226  time: 0.4916  data_time: 0.0052  lr: 0.00011489  max_mem: 2833M
[05/09 18:20:53 d2.utils.events]:  eta: 0:28:39  iter: 479  total_loss: 0.2098  loss_cls: 0.01937  loss_box_reg: 0.16  loss_rpn_cls: 0.0006796  loss_rpn_loc: 0.01752  time: 0.4920  data_time: 0.0045  lr: 0.00011988  max_mem: 2833M
[05/09 18:21:03 d2.utils.events]:  eta: 0:28:32  iter: 499  total_loss: 0.1691  loss_cls: 0.0194  loss_box_reg: 0.1257  loss_rpn_cls: 0.0005562  loss_rpn_loc: 0.01411  time: 0.4923  data_time: 0.0057  lr: 0.00012488  max_mem: 2833M
[05/09 18:21:13 d2.utils.events]:  eta: 0:28:24  iter: 519  total_loss: 0.1789  loss_cls: 0.02088  loss_box_reg: 0.1329  loss_rpn_cls: 0.0004473  loss_rpn_loc: 0.01518  time: 0.4929  data_time: 0.0052  lr: 0.00012987  max_mem: 2833M
[05/09 18:21:24 d2.utils.events]:  eta: 0:28:15  iter: 539  total_loss: 0.1771  loss_cls: 0.02373  loss_box_reg: 0.128  loss_rpn_cls: 0.0005175  loss_rpn_loc: 0.01496  time: 0.4932  data_time: 0.0051  lr: 0.00013487  max_mem: 2833M
[05/09 18:21:34 d2.utils.events]:  eta: 0:28:07  iter: 559  total_loss: 0.1596  loss_cls: 0.02533  loss_box_reg: 0.1109  loss_rpn_cls: 0.0006054  loss_rpn_loc: 0.01376  time: 0.4936  data_time: 0.0052  lr: 0.00013986  max_mem: 2833M
[05/09 18:21:44 d2.utils.events]:  eta: 0:27:56  iter: 579  total_loss: 0.1541  loss_cls: 0.02089  loss_box_reg: 0.1127  loss_rpn_cls: 0.0005695  loss_rpn_loc: 0.01308  time: 0.4940  data_time: 0.0049  lr: 0.00014486  max_mem: 2833M
[05/09 18:21:54 d2.utils.events]:  eta: 0:27:46  iter: 599  total_loss: 0.1377  loss_cls: 0.01595  loss_box_reg: 0.1059  loss_rpn_cls: 0.0002703  loss_rpn_loc: 0.01076  time: 0.4938  data_time: 0.0050  lr: 0.00014985  max_mem: 2833M
[05/09 18:22:03 d2.utils.events]:  eta: 0:27:36  iter: 619  total_loss: 0.1448  loss_cls: 0.02111  loss_box_reg: 0.1204  loss_rpn_cls: 0.000695  loss_rpn_loc: 0.01194  time: 0.4939  data_time: 0.0051  lr: 0.00015485  max_mem: 2833M
[05/09 18:22:14 d2.utils.events]:  eta: 0:27:27  iter: 639  total_loss: 0.1592  loss_cls: 0.02233  loss_box_reg: 0.1195  loss_rpn_cls: 0.0006746  loss_rpn_loc: 0.01193  time: 0.4942  data_time: 0.0056  lr: 0.00015984  max_mem: 2833M
[05/09 18:22:24 d2.utils.events]:  eta: 0:27:18  iter: 659  total_loss: 0.14  loss_cls: 0.01881  loss_box_reg: 0.1056  loss_rpn_cls: 0.0002479  loss_rpn_loc: 0.01651  time: 0.4943  data_time: 0.0050  lr: 0.00016484  max_mem: 2833M
[05/09 18:22:34 d2.utils.events]:  eta: 0:27:10  iter: 679  total_loss: 0.1521  loss_cls: 0.02006  loss_box_reg: 0.1105  loss_rpn_cls: 0.0003753  loss_rpn_loc: 0.01207  time: 0.4947  data_time: 0.0051  lr: 0.00016983  max_mem: 2833M
[05/09 18:22:43 d2.utils.events]:  eta: 0:26:58  iter: 699  total_loss: 0.1573  loss_cls: 0.01949  loss_box_reg: 0.1138  loss_rpn_cls: 0.0003669  loss_rpn_loc: 0.01296  time: 0.4943  data_time: 0.0048  lr: 0.00017483  max_mem: 2833M
[05/09 18:22:53 d2.utils.events]:  eta: 0:26:49  iter: 719  total_loss: 0.166  loss_cls: 0.0188  loss_box_reg: 0.1139  loss_rpn_cls: 0.000592  loss_rpn_loc: 0.01266  time: 0.4943  data_time: 0.0047  lr: 0.00017982  max_mem: 2833M
[05/09 18:23:03 d2.utils.events]:  eta: 0:26:39  iter: 739  total_loss: 0.151  loss_cls: 0.01525  loss_box_reg: 0.1038  loss_rpn_cls: 0.001395  loss_rpn_loc: 0.01982  time: 0.4943  data_time: 0.0048  lr: 0.00018482  max_mem: 2833M
[05/09 18:23:13 d2.utils.events]:  eta: 0:26:29  iter: 759  total_loss: 0.1394  loss_cls: 0.01807  loss_box_reg: 0.1043  loss_rpn_cls: 0.0002498  loss_rpn_loc: 0.01132  time: 0.4946  data_time: 0.0056  lr: 0.00018981  max_mem: 2833M
[05/09 18:23:23 d2.utils.events]:  eta: 0:26:20  iter: 779  total_loss: 0.1478  loss_cls: 0.01423  loss_box_reg: 0.1133  loss_rpn_cls: 0.000445  loss_rpn_loc: 0.01627  time: 0.4949  data_time: 0.0055  lr: 0.00019481  max_mem: 2833M
[05/09 18:23:33 d2.utils.events]:  eta: 0:26:13  iter: 799  total_loss: 0.14  loss_cls: 0.01148  loss_box_reg: 0.1142  loss_rpn_cls: 0.0002455  loss_rpn_loc: 0.01522  time: 0.4951  data_time: 0.0047  lr: 0.0001998  max_mem: 2833M
[05/09 18:23:43 d2.utils.events]:  eta: 0:26:04  iter: 819  total_loss: 0.1394  loss_cls: 0.01221  loss_box_reg: 0.111  loss_rpn_cls: 0.0002918  loss_rpn_loc: 0.01112  time: 0.4950  data_time: 0.0047  lr: 0.0002048  max_mem: 2833M
[05/09 18:23:53 d2.utils.events]:  eta: 0:25:54  iter: 839  total_loss: 0.1393  loss_cls: 0.009367  loss_box_reg: 0.1102  loss_rpn_cls: 0.000408  loss_rpn_loc: 0.005633  time: 0.4951  data_time: 0.0047  lr: 0.00020979  max_mem: 2833M
[05/09 18:24:04 d2.utils.events]:  eta: 0:25:45  iter: 859  total_loss: 0.1394  loss_cls: 0.009755  loss_box_reg: 0.1159  loss_rpn_cls: 0.0004067  loss_rpn_loc: 0.009477  time: 0.4955  data_time: 0.0049  lr: 0.00021479  max_mem: 2833M
[05/09 18:24:14 d2.utils.events]:  eta: 0:25:35  iter: 879  total_loss: 0.1266  loss_cls: 0.01337  loss_box_reg: 0.08792  loss_rpn_cls: 0.0002601  loss_rpn_loc: 0.01087  time: 0.4957  data_time: 0.0048  lr: 0.00021978  max_mem: 2833M
[05/09 18:24:24 d2.utils.events]:  eta: 0:25:27  iter: 899  total_loss: 0.1213  loss_cls: 0.01051  loss_box_reg: 0.09574  loss_rpn_cls: 0.0002731  loss_rpn_loc: 0.01199  time: 0.4960  data_time: 0.0052  lr: 0.00022478  max_mem: 2833M
[05/09 18:24:34 d2.utils.events]:  eta: 0:25:18  iter: 919  total_loss: 0.1386  loss_cls: 0.01724  loss_box_reg: 0.1073  loss_rpn_cls: 0.0001537  loss_rpn_loc: 0.01019  time: 0.4962  data_time: 0.0051  lr: 0.00022977  max_mem: 2833M
[05/09 18:24:44 d2.utils.events]:  eta: 0:25:08  iter: 939  total_loss: 0.1274  loss_cls: 0.01306  loss_box_reg: 0.09199  loss_rpn_cls: 0.00027  loss_rpn_loc: 0.01034  time: 0.4961  data_time: 0.0047  lr: 0.00023477  max_mem: 2833M
[05/09 18:24:54 d2.utils.events]:  eta: 0:24:58  iter: 959  total_loss: 0.1511  loss_cls: 0.01776  loss_box_reg: 0.1027  loss_rpn_cls: 0.0004543  loss_rpn_loc: 0.01852  time: 0.4960  data_time: 0.0049  lr: 0.00023976  max_mem: 2833M
[05/09 18:25:04 d2.utils.events]:  eta: 0:24:48  iter: 979  total_loss: 0.1486  loss_cls: 0.01026  loss_box_reg: 0.11  loss_rpn_cls: 0.0002393  loss_rpn_loc: 0.007385  time: 0.4961  data_time: 0.0055  lr: 0.00024476  max_mem: 2833M
[05/09 18:25:14 d2.utils.events]:  eta: 0:24:39  iter: 999  total_loss: 0.138  loss_cls: 0.01578  loss_box_reg: 0.1013  loss_rpn_cls: 0.0002405  loss_rpn_loc: 0.01447  time: 0.4963  data_time: 0.0054  lr: 0.00024975  max_mem: 2833M
[05/09 18:25:24 d2.utils.events]:  eta: 0:24:29  iter: 1019  total_loss: 0.134  loss_cls: 0.009996  loss_box_reg: 0.09897  loss_rpn_cls: 0.0002115  loss_rpn_loc: 0.0123  time: 0.4964  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:25:34 d2.utils.events]:  eta: 0:24:19  iter: 1039  total_loss: 0.1454  loss_cls: 0.01427  loss_box_reg: 0.1069  loss_rpn_cls: 0.0001787  loss_rpn_loc: 0.009115  time: 0.4964  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:25:44 d2.utils.events]:  eta: 0:24:09  iter: 1059  total_loss: 0.1133  loss_cls: 0.008602  loss_box_reg: 0.08605  loss_rpn_cls: 0.000266  loss_rpn_loc: 0.01097  time: 0.4963  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:25:54 d2.utils.events]:  eta: 0:24:02  iter: 1079  total_loss: 0.1255  loss_cls: 0.01409  loss_box_reg: 0.1031  loss_rpn_cls: 0.0001956  loss_rpn_loc: 0.006586  time: 0.4967  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:26:04 d2.utils.events]:  eta: 0:23:52  iter: 1099  total_loss: 0.1237  loss_cls: 0.01455  loss_box_reg: 0.08674  loss_rpn_cls: 0.0001868  loss_rpn_loc: 0.01601  time: 0.4967  data_time: 0.0057  lr: 0.00025  max_mem: 2833M
[05/09 18:26:14 d2.utils.events]:  eta: 0:23:44  iter: 1119  total_loss: 0.1344  loss_cls: 0.0242  loss_box_reg: 0.09844  loss_rpn_cls: 0.0002686  loss_rpn_loc: 0.02182  time: 0.4968  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:26:24 d2.utils.events]:  eta: 0:23:35  iter: 1139  total_loss: 0.1418  loss_cls: 0.01295  loss_box_reg: 0.1091  loss_rpn_cls: 0.0002999  loss_rpn_loc: 0.01662  time: 0.4971  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:26:35 d2.utils.events]:  eta: 0:23:26  iter: 1159  total_loss: 0.1577  loss_cls: 0.01289  loss_box_reg: 0.1245  loss_rpn_cls: 0.0003167  loss_rpn_loc: 0.01265  time: 0.4974  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:26:45 d2.utils.events]:  eta: 0:23:16  iter: 1179  total_loss: 0.1149  loss_cls: 0.01094  loss_box_reg: 0.08472  loss_rpn_cls: 0.0001692  loss_rpn_loc: 0.01383  time: 0.4976  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:26:55 d2.utils.events]:  eta: 0:23:06  iter: 1199  total_loss: 0.1157  loss_cls: 0.01036  loss_box_reg: 0.1005  loss_rpn_cls: 0.0001493  loss_rpn_loc: 0.005171  time: 0.4977  data_time: 0.0064  lr: 0.00025  max_mem: 2833M
[05/09 18:27:05 d2.utils.events]:  eta: 0:22:57  iter: 1219  total_loss: 0.1192  loss_cls: 0.01001  loss_box_reg: 0.08837  loss_rpn_cls: 0.0002031  loss_rpn_loc: 0.01663  time: 0.4979  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:27:15 d2.utils.events]:  eta: 0:22:48  iter: 1239  total_loss: 0.126  loss_cls: 0.01062  loss_box_reg: 0.09429  loss_rpn_cls: 0.0001329  loss_rpn_loc: 0.006919  time: 0.4980  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:27:25 d2.utils.events]:  eta: 0:22:38  iter: 1259  total_loss: 0.1092  loss_cls: 0.006631  loss_box_reg: 0.0821  loss_rpn_cls: 0.0001353  loss_rpn_loc: 0.01364  time: 0.4982  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:27:35 d2.utils.events]:  eta: 0:22:29  iter: 1279  total_loss: 0.1343  loss_cls: 0.01215  loss_box_reg: 0.1066  loss_rpn_cls: 0.0002379  loss_rpn_loc: 0.01025  time: 0.4980  data_time: 0.0046  lr: 0.00025  max_mem: 2833M
[05/09 18:27:45 d2.utils.events]:  eta: 0:22:18  iter: 1299  total_loss: 0.1175  loss_cls: 0.008592  loss_box_reg: 0.0857  loss_rpn_cls: 0.0001799  loss_rpn_loc: 0.01197  time: 0.4982  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:27:55 d2.utils.events]:  eta: 0:22:07  iter: 1319  total_loss: 0.1524  loss_cls: 0.009436  loss_box_reg: 0.1156  loss_rpn_cls: 0.0001715  loss_rpn_loc: 0.01333  time: 0.4982  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:28:05 d2.utils.events]:  eta: 0:21:57  iter: 1339  total_loss: 0.1131  loss_cls: 0.008415  loss_box_reg: 0.08812  loss_rpn_cls: 0.0001796  loss_rpn_loc: 0.005081  time: 0.4981  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:28:15 d2.utils.events]:  eta: 0:21:47  iter: 1359  total_loss: 0.1143  loss_cls: 0.01377  loss_box_reg: 0.09335  loss_rpn_cls: 0.000191  loss_rpn_loc: 0.009199  time: 0.4983  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:28:25 d2.utils.events]:  eta: 0:21:38  iter: 1379  total_loss: 0.107  loss_cls: 0.0117  loss_box_reg: 0.08067  loss_rpn_cls: 0.0001285  loss_rpn_loc: 0.01005  time: 0.4982  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:28:36 d2.utils.events]:  eta: 0:21:29  iter: 1399  total_loss: 0.1281  loss_cls: 0.01643  loss_box_reg: 0.09212  loss_rpn_cls: 0.0001474  loss_rpn_loc: 0.00823  time: 0.4984  data_time: 0.0063  lr: 0.00025  max_mem: 2833M
[05/09 18:28:46 d2.utils.events]:  eta: 0:21:20  iter: 1419  total_loss: 0.1236  loss_cls: 0.008478  loss_box_reg: 0.1055  loss_rpn_cls: 0.0001492  loss_rpn_loc: 0.01147  time: 0.4984  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:28:55 d2.utils.events]:  eta: 0:21:09  iter: 1439  total_loss: 0.1469  loss_cls: 0.009602  loss_box_reg: 0.1063  loss_rpn_cls: 0.0001495  loss_rpn_loc: 0.0102  time: 0.4982  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:29:05 d2.utils.events]:  eta: 0:20:59  iter: 1459  total_loss: 0.115  loss_cls: 0.01448  loss_box_reg: 0.08622  loss_rpn_cls: 0.0001203  loss_rpn_loc: 0.01203  time: 0.4983  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:29:15 d2.utils.events]:  eta: 0:20:49  iter: 1479  total_loss: 0.1035  loss_cls: 0.01198  loss_box_reg: 0.07268  loss_rpn_cls: 0.000212  loss_rpn_loc: 0.0071  time: 0.4983  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:29:25 d2.utils.events]:  eta: 0:20:40  iter: 1499  total_loss: 0.115  loss_cls: 0.01704  loss_box_reg: 0.08501  loss_rpn_cls: 0.0002198  loss_rpn_loc: 0.00911  time: 0.4983  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:29:35 d2.utils.events]:  eta: 0:20:30  iter: 1519  total_loss: 0.1689  loss_cls: 0.009814  loss_box_reg: 0.1335  loss_rpn_cls: 0.0001975  loss_rpn_loc: 0.009407  time: 0.4982  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:29:45 d2.utils.events]:  eta: 0:20:20  iter: 1539  total_loss: 0.1266  loss_cls: 0.0103  loss_box_reg: 0.1012  loss_rpn_cls: 0.0003005  loss_rpn_loc: 0.01022  time: 0.4982  data_time: 0.0058  lr: 0.00025  max_mem: 2833M
[05/09 18:29:55 d2.utils.events]:  eta: 0:20:09  iter: 1559  total_loss: 0.1133  loss_cls: 0.006021  loss_box_reg: 0.08134  loss_rpn_cls: 0.0002405  loss_rpn_loc: 0.01109  time: 0.4982  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:30:05 d2.utils.events]:  eta: 0:20:00  iter: 1579  total_loss: 0.1228  loss_cls: 0.01764  loss_box_reg: 0.08187  loss_rpn_cls: 0.0001937  loss_rpn_loc: 0.01443  time: 0.4983  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:30:15 d2.utils.events]:  eta: 0:19:50  iter: 1599  total_loss: 0.09722  loss_cls: 0.00967  loss_box_reg: 0.07691  loss_rpn_cls: 0.000174  loss_rpn_loc: 0.00616  time: 0.4984  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:30:26 d2.utils.events]:  eta: 0:19:41  iter: 1619  total_loss: 0.1072  loss_cls: 0.01149  loss_box_reg: 0.0771  loss_rpn_cls: 0.0002038  loss_rpn_loc: 0.01264  time: 0.4985  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:30:36 d2.utils.events]:  eta: 0:19:31  iter: 1639  total_loss: 0.121  loss_cls: 0.01078  loss_box_reg: 0.08479  loss_rpn_cls: 0.0001226  loss_rpn_loc: 0.008967  time: 0.4987  data_time: 0.0060  lr: 0.00025  max_mem: 2833M
[05/09 18:30:46 d2.utils.events]:  eta: 0:19:21  iter: 1659  total_loss: 0.1127  loss_cls: 0.01022  loss_box_reg: 0.07963  loss_rpn_cls: 0.0002059  loss_rpn_loc: 0.01104  time: 0.4986  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:30:56 d2.utils.events]:  eta: 0:19:11  iter: 1679  total_loss: 0.1278  loss_cls: 0.009331  loss_box_reg: 0.1018  loss_rpn_cls: 0.0001678  loss_rpn_loc: 0.01639  time: 0.4986  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:31:06 d2.utils.events]:  eta: 0:19:01  iter: 1699  total_loss: 0.1059  loss_cls: 0.01206  loss_box_reg: 0.0689  loss_rpn_cls: 0.0001304  loss_rpn_loc: 0.01341  time: 0.4985  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:31:16 d2.utils.events]:  eta: 0:18:51  iter: 1719  total_loss: 0.1184  loss_cls: 0.008519  loss_box_reg: 0.09105  loss_rpn_cls: 0.0001773  loss_rpn_loc: 0.01431  time: 0.4986  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:31:26 d2.utils.events]:  eta: 0:18:41  iter: 1739  total_loss: 0.1067  loss_cls: 0.009892  loss_box_reg: 0.08308  loss_rpn_cls: 0.0001626  loss_rpn_loc: 0.01075  time: 0.4986  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:31:36 d2.utils.events]:  eta: 0:18:31  iter: 1759  total_loss: 0.1185  loss_cls: 0.01068  loss_box_reg: 0.08463  loss_rpn_cls: 0.0001941  loss_rpn_loc: 0.01492  time: 0.4987  data_time: 0.0066  lr: 0.00025  max_mem: 2833M
[05/09 18:31:46 d2.utils.events]:  eta: 0:18:21  iter: 1779  total_loss: 0.1194  loss_cls: 0.009068  loss_box_reg: 0.08921  loss_rpn_cls: 0.0001487  loss_rpn_loc: 0.01364  time: 0.4988  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:31:56 d2.utils.events]:  eta: 0:18:11  iter: 1799  total_loss: 0.1113  loss_cls: 0.008875  loss_box_reg: 0.08432  loss_rpn_cls: 0.0001835  loss_rpn_loc: 0.01238  time: 0.4988  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:32:06 d2.utils.events]:  eta: 0:18:01  iter: 1819  total_loss: 0.1017  loss_cls: 0.01306  loss_box_reg: 0.07724  loss_rpn_cls: 0.0001066  loss_rpn_loc: 0.008857  time: 0.4988  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:32:16 d2.utils.events]:  eta: 0:17:51  iter: 1839  total_loss: 0.1198  loss_cls: 0.0163  loss_box_reg: 0.07705  loss_rpn_cls: 0.0001397  loss_rpn_loc: 0.009983  time: 0.4988  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:32:26 d2.utils.events]:  eta: 0:17:41  iter: 1859  total_loss: 0.09894  loss_cls: 0.008528  loss_box_reg: 0.07387  loss_rpn_cls: 0.0001389  loss_rpn_loc: 0.01223  time: 0.4989  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:32:36 d2.utils.events]:  eta: 0:17:31  iter: 1879  total_loss: 0.1211  loss_cls: 0.01383  loss_box_reg: 0.08563  loss_rpn_cls: 0.0001936  loss_rpn_loc: 0.01205  time: 0.4988  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:32:46 d2.utils.events]:  eta: 0:17:21  iter: 1899  total_loss: 0.08824  loss_cls: 0.007926  loss_box_reg: 0.07283  loss_rpn_cls: 0.0002425  loss_rpn_loc: 0.01098  time: 0.4987  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:32:56 d2.utils.events]:  eta: 0:17:11  iter: 1919  total_loss: 0.1003  loss_cls: 0.005966  loss_box_reg: 0.07504  loss_rpn_cls: 0.0002691  loss_rpn_loc: 0.01226  time: 0.4987  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:33:06 d2.utils.events]:  eta: 0:17:01  iter: 1939  total_loss: 0.09812  loss_cls: 0.007344  loss_box_reg: 0.07852  loss_rpn_cls: 0.0001484  loss_rpn_loc: 0.007768  time: 0.4986  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:33:16 d2.utils.events]:  eta: 0:16:51  iter: 1959  total_loss: 0.0961  loss_cls: 0.005021  loss_box_reg: 0.07958  loss_rpn_cls: 0.0001016  loss_rpn_loc: 0.01186  time: 0.4986  data_time: 0.0046  lr: 0.00025  max_mem: 2833M
[05/09 18:33:25 d2.utils.events]:  eta: 0:16:41  iter: 1979  total_loss: 0.09396  loss_cls: 0.00424  loss_box_reg: 0.06761  loss_rpn_cls: 0.0001823  loss_rpn_loc: 0.01108  time: 0.4984  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:33:35 d2.utils.events]:  eta: 0:16:30  iter: 1999  total_loss: 0.09309  loss_cls: 0.005837  loss_box_reg: 0.07314  loss_rpn_cls: 0.0001052  loss_rpn_loc: 0.009577  time: 0.4983  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:33:45 d2.utils.events]:  eta: 0:16:20  iter: 2019  total_loss: 0.1142  loss_cls: 0.01047  loss_box_reg: 0.08441  loss_rpn_cls: 0.0002019  loss_rpn_loc: 0.01433  time: 0.4983  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:33:55 d2.utils.events]:  eta: 0:16:10  iter: 2039  total_loss: 0.1026  loss_cls: 0.01173  loss_box_reg: 0.07391  loss_rpn_cls: 0.000149  loss_rpn_loc: 0.009882  time: 0.4983  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:34:05 d2.utils.events]:  eta: 0:16:01  iter: 2059  total_loss: 0.1095  loss_cls: 0.006577  loss_box_reg: 0.09075  loss_rpn_cls: 0.0001835  loss_rpn_loc: 0.005156  time: 0.4983  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:34:15 d2.utils.events]:  eta: 0:15:51  iter: 2079  total_loss: 0.1187  loss_cls: 0.01228  loss_box_reg: 0.07606  loss_rpn_cls: 0.0002951  loss_rpn_loc: 0.01261  time: 0.4984  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:34:25 d2.utils.events]:  eta: 0:15:40  iter: 2099  total_loss: 0.107  loss_cls: 0.00858  loss_box_reg: 0.08281  loss_rpn_cls: 0.0001941  loss_rpn_loc: 0.01256  time: 0.4984  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:34:35 d2.utils.events]:  eta: 0:15:31  iter: 2119  total_loss: 0.1145  loss_cls: 0.008175  loss_box_reg: 0.08444  loss_rpn_cls: 0.0001788  loss_rpn_loc: 0.01217  time: 0.4984  data_time: 0.0046  lr: 0.00025  max_mem: 2833M
[05/09 18:34:45 d2.utils.events]:  eta: 0:15:21  iter: 2139  total_loss: 0.09737  loss_cls: 0.006446  loss_box_reg: 0.07193  loss_rpn_cls: 0.0001236  loss_rpn_loc: 0.009815  time: 0.4984  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:34:55 d2.utils.events]:  eta: 0:15:10  iter: 2159  total_loss: 0.09722  loss_cls: 0.004362  loss_box_reg: 0.06754  loss_rpn_cls: 0.0001501  loss_rpn_loc: 0.01268  time: 0.4983  data_time: 0.0058  lr: 0.00025  max_mem: 2833M
[05/09 18:35:05 d2.utils.events]:  eta: 0:15:00  iter: 2179  total_loss: 0.09628  loss_cls: 0.008024  loss_box_reg: 0.07498  loss_rpn_cls: 0.000132  loss_rpn_loc: 0.01071  time: 0.4983  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:35:15 d2.utils.events]:  eta: 0:14:50  iter: 2199  total_loss: 0.106  loss_cls: 0.006679  loss_box_reg: 0.08196  loss_rpn_cls: 0.0001721  loss_rpn_loc: 0.01347  time: 0.4983  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:35:25 d2.utils.events]:  eta: 0:14:40  iter: 2219  total_loss: 0.1037  loss_cls: 0.006007  loss_box_reg: 0.08344  loss_rpn_cls: 0.0001234  loss_rpn_loc: 0.009099  time: 0.4984  data_time: 0.0045  lr: 0.00025  max_mem: 2833M
[05/09 18:35:35 d2.utils.events]:  eta: 0:14:30  iter: 2239  total_loss: 0.09835  loss_cls: 0.008189  loss_box_reg: 0.07855  loss_rpn_cls: 0.0001519  loss_rpn_loc: 0.008579  time: 0.4984  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:35:45 d2.utils.events]:  eta: 0:14:20  iter: 2259  total_loss: 0.09513  loss_cls: 0.007264  loss_box_reg: 0.07287  loss_rpn_cls: 0.000101  loss_rpn_loc: 0.003937  time: 0.4983  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:35:55 d2.utils.events]:  eta: 0:14:10  iter: 2279  total_loss: 0.0928  loss_cls: 0.007135  loss_box_reg: 0.07083  loss_rpn_cls: 0.0001189  loss_rpn_loc: 0.007158  time: 0.4983  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:36:05 d2.utils.events]:  eta: 0:14:01  iter: 2299  total_loss: 0.09591  loss_cls: 0.008024  loss_box_reg: 0.0723  loss_rpn_cls: 0.000144  loss_rpn_loc: 0.01178  time: 0.4984  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:36:15 d2.utils.events]:  eta: 0:13:51  iter: 2319  total_loss: 0.1001  loss_cls: 0.006749  loss_box_reg: 0.07318  loss_rpn_cls: 0.0002429  loss_rpn_loc: 0.01416  time: 0.4984  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:36:25 d2.utils.events]:  eta: 0:13:41  iter: 2339  total_loss: 0.09511  loss_cls: 0.006832  loss_box_reg: 0.07491  loss_rpn_cls: 0.0002663  loss_rpn_loc: 0.00996  time: 0.4984  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:36:35 d2.utils.events]:  eta: 0:13:31  iter: 2359  total_loss: 0.1043  loss_cls: 0.005765  loss_box_reg: 0.07721  loss_rpn_cls: 0.0001565  loss_rpn_loc: 0.01383  time: 0.4985  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:36:45 d2.utils.events]:  eta: 0:13:21  iter: 2379  total_loss: 0.1011  loss_cls: 0.006519  loss_box_reg: 0.08242  loss_rpn_cls: 0.000203  loss_rpn_loc: 0.005301  time: 0.4984  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:36:55 d2.utils.events]:  eta: 0:13:11  iter: 2399  total_loss: 0.1074  loss_cls: 0.00698  loss_box_reg: 0.08268  loss_rpn_cls: 0.0002442  loss_rpn_loc: 0.01228  time: 0.4985  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:37:05 d2.utils.events]:  eta: 0:13:01  iter: 2419  total_loss: 0.09724  loss_cls: 0.009355  loss_box_reg: 0.07459  loss_rpn_cls: 0.0001485  loss_rpn_loc: 0.006954  time: 0.4985  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:37:15 d2.utils.events]:  eta: 0:12:51  iter: 2439  total_loss: 0.09113  loss_cls: 0.005974  loss_box_reg: 0.07057  loss_rpn_cls: 0.0001402  loss_rpn_loc: 0.005425  time: 0.4984  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:37:25 d2.utils.events]:  eta: 0:12:41  iter: 2459  total_loss: 0.1129  loss_cls: 0.006316  loss_box_reg: 0.0794  loss_rpn_cls: 0.0001637  loss_rpn_loc: 0.02082  time: 0.4984  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:37:35 d2.utils.events]:  eta: 0:12:32  iter: 2479  total_loss: 0.09271  loss_cls: 0.005554  loss_box_reg: 0.07597  loss_rpn_cls: 0.0001663  loss_rpn_loc: 0.007693  time: 0.4984  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:37:45 d2.utils.events]:  eta: 0:12:22  iter: 2499  total_loss: 0.112  loss_cls: 0.006914  loss_box_reg: 0.08064  loss_rpn_cls: 0.0001131  loss_rpn_loc: 0.009886  time: 0.4985  data_time: 0.0056  lr: 0.00025  max_mem: 2833M
[05/09 18:37:55 d2.utils.events]:  eta: 0:12:12  iter: 2519  total_loss: 0.1191  loss_cls: 0.008976  loss_box_reg: 0.09111  loss_rpn_cls: 0.0001997  loss_rpn_loc: 0.009605  time: 0.4986  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:38:05 d2.utils.events]:  eta: 0:12:02  iter: 2539  total_loss: 0.09456  loss_cls: 0.004872  loss_box_reg: 0.07507  loss_rpn_cls: 0.0001125  loss_rpn_loc: 0.005433  time: 0.4986  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:38:15 d2.utils.events]:  eta: 0:11:52  iter: 2559  total_loss: 0.09419  loss_cls: 0.00656  loss_box_reg: 0.07361  loss_rpn_cls: 0.0001245  loss_rpn_loc: 0.00892  time: 0.4986  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:38:25 d2.utils.events]:  eta: 0:11:42  iter: 2579  total_loss: 0.09505  loss_cls: 0.006305  loss_box_reg: 0.07664  loss_rpn_cls: 0.0001454  loss_rpn_loc: 0.007783  time: 0.4986  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:38:35 d2.utils.events]:  eta: 0:11:32  iter: 2599  total_loss: 0.1049  loss_cls: 0.01596  loss_box_reg: 0.07777  loss_rpn_cls: 0.0001712  loss_rpn_loc: 0.004738  time: 0.4986  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:38:45 d2.utils.events]:  eta: 0:11:22  iter: 2619  total_loss: 0.09723  loss_cls: 0.008524  loss_box_reg: 0.07177  loss_rpn_cls: 0.0002502  loss_rpn_loc: 0.00821  time: 0.4986  data_time: 0.0060  lr: 0.00025  max_mem: 2833M
[05/09 18:38:55 d2.utils.events]:  eta: 0:11:12  iter: 2639  total_loss: 0.09322  loss_cls: 0.005745  loss_box_reg: 0.07361  loss_rpn_cls: 0.0001972  loss_rpn_loc: 0.01004  time: 0.4987  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:39:05 d2.utils.events]:  eta: 0:11:02  iter: 2659  total_loss: 0.1022  loss_cls: 0.006477  loss_box_reg: 0.0768  loss_rpn_cls: 0.0001818  loss_rpn_loc: 0.009678  time: 0.4986  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:39:15 d2.utils.events]:  eta: 0:10:52  iter: 2679  total_loss: 0.1028  loss_cls: 0.008263  loss_box_reg: 0.07992  loss_rpn_cls: 0.0002657  loss_rpn_loc: 0.009373  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:39:25 d2.utils.events]:  eta: 0:10:42  iter: 2699  total_loss: 0.1094  loss_cls: 0.00906  loss_box_reg: 0.06847  loss_rpn_cls: 0.0003755  loss_rpn_loc: 0.01649  time: 0.4987  data_time: 0.0046  lr: 0.00025  max_mem: 2833M
[05/09 18:39:35 d2.utils.events]:  eta: 0:10:32  iter: 2719  total_loss: 0.1089  loss_cls: 0.01223  loss_box_reg: 0.07712  loss_rpn_cls: 0.0001557  loss_rpn_loc: 0.009823  time: 0.4986  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:39:45 d2.utils.events]:  eta: 0:10:22  iter: 2739  total_loss: 0.1001  loss_cls: 0.008156  loss_box_reg: 0.07739  loss_rpn_cls: 0.000185  loss_rpn_loc: 0.005194  time: 0.4986  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:39:55 d2.utils.events]:  eta: 0:10:12  iter: 2759  total_loss: 0.1056  loss_cls: 0.0102  loss_box_reg: 0.07729  loss_rpn_cls: 0.0001786  loss_rpn_loc: 0.01277  time: 0.4986  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:40:05 d2.utils.events]:  eta: 0:10:02  iter: 2779  total_loss: 0.08867  loss_cls: 0.009304  loss_box_reg: 0.0711  loss_rpn_cls: 0.0001221  loss_rpn_loc: 0.007478  time: 0.4986  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:40:15 d2.utils.events]:  eta: 0:09:52  iter: 2799  total_loss: 0.1019  loss_cls: 0.004716  loss_box_reg: 0.07833  loss_rpn_cls: 0.0001488  loss_rpn_loc: 0.008395  time: 0.4985  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:40:25 d2.utils.events]:  eta: 0:09:42  iter: 2819  total_loss: 0.108  loss_cls: 0.007516  loss_box_reg: 0.08011  loss_rpn_cls: 0.0002667  loss_rpn_loc: 0.009486  time: 0.4985  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:40:35 d2.utils.events]:  eta: 0:09:33  iter: 2839  total_loss: 0.09274  loss_cls: 0.004975  loss_box_reg: 0.0715  loss_rpn_cls: 0.0001199  loss_rpn_loc: 0.008975  time: 0.4986  data_time: 0.0056  lr: 0.00025  max_mem: 2833M
[05/09 18:40:45 d2.utils.events]:  eta: 0:09:23  iter: 2859  total_loss: 0.1093  loss_cls: 0.003813  loss_box_reg: 0.07747  loss_rpn_cls: 0.0002181  loss_rpn_loc: 0.01093  time: 0.4986  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:40:55 d2.utils.events]:  eta: 0:09:13  iter: 2879  total_loss: 0.08627  loss_cls: 0.005022  loss_box_reg: 0.06707  loss_rpn_cls: 0.0001132  loss_rpn_loc: 0.006294  time: 0.4986  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:41:05 d2.utils.events]:  eta: 0:09:03  iter: 2899  total_loss: 0.1098  loss_cls: 0.01151  loss_box_reg: 0.08159  loss_rpn_cls: 0.0001785  loss_rpn_loc: 0.007039  time: 0.4987  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:41:15 d2.utils.events]:  eta: 0:08:53  iter: 2919  total_loss: 0.1011  loss_cls: 0.006602  loss_box_reg: 0.07451  loss_rpn_cls: 0.0001224  loss_rpn_loc: 0.01103  time: 0.4987  data_time: 0.0057  lr: 0.00025  max_mem: 2833M
[05/09 18:41:25 d2.utils.events]:  eta: 0:08:43  iter: 2939  total_loss: 0.09843  loss_cls: 0.005629  loss_box_reg: 0.07541  loss_rpn_cls: 0.0001228  loss_rpn_loc: 0.01007  time: 0.4987  data_time: 0.0060  lr: 0.00025  max_mem: 2833M
[05/09 18:41:35 d2.utils.events]:  eta: 0:08:34  iter: 2959  total_loss: 0.1014  loss_cls: 0.005511  loss_box_reg: 0.0796  loss_rpn_cls: 9.881e-05  loss_rpn_loc: 0.007008  time: 0.4987  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:41:45 d2.utils.events]:  eta: 0:08:24  iter: 2979  total_loss: 0.09689  loss_cls: 0.01227  loss_box_reg: 0.06723  loss_rpn_cls: 0.0001426  loss_rpn_loc: 0.01376  time: 0.4988  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:41:55 d2.utils.events]:  eta: 0:08:14  iter: 2999  total_loss: 0.095  loss_cls: 0.00747  loss_box_reg: 0.07384  loss_rpn_cls: 0.0001307  loss_rpn_loc: 0.01101  time: 0.4987  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:42:05 d2.utils.events]:  eta: 0:08:04  iter: 3019  total_loss: 0.1133  loss_cls: 0.009101  loss_box_reg: 0.08278  loss_rpn_cls: 0.0001294  loss_rpn_loc: 0.01436  time: 0.4987  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:42:15 d2.utils.events]:  eta: 0:07:54  iter: 3039  total_loss: 0.1072  loss_cls: 0.007892  loss_box_reg: 0.08784  loss_rpn_cls: 0.0001051  loss_rpn_loc: 0.006225  time: 0.4987  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:42:25 d2.utils.events]:  eta: 0:07:44  iter: 3059  total_loss: 0.1049  loss_cls: 0.009134  loss_box_reg: 0.07931  loss_rpn_cls: 0.0001051  loss_rpn_loc: 0.006906  time: 0.4988  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:42:35 d2.utils.events]:  eta: 0:07:34  iter: 3079  total_loss: 0.1013  loss_cls: 0.005204  loss_box_reg: 0.08247  loss_rpn_cls: 0.000124  loss_rpn_loc: 0.00863  time: 0.4987  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:42:45 d2.utils.events]:  eta: 0:07:24  iter: 3099  total_loss: 0.09158  loss_cls: 0.0103  loss_box_reg: 0.07112  loss_rpn_cls: 0.0001257  loss_rpn_loc: 0.008528  time: 0.4987  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:42:55 d2.utils.events]:  eta: 0:07:15  iter: 3119  total_loss: 0.102  loss_cls: 0.006676  loss_box_reg: 0.07341  loss_rpn_cls: 0.0002087  loss_rpn_loc: 0.01189  time: 0.4988  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:43:05 d2.utils.events]:  eta: 0:07:05  iter: 3139  total_loss: 0.09708  loss_cls: 0.005319  loss_box_reg: 0.07621  loss_rpn_cls: 0.0001207  loss_rpn_loc: 0.009395  time: 0.4988  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:43:15 d2.utils.events]:  eta: 0:06:55  iter: 3159  total_loss: 0.1045  loss_cls: 0.01252  loss_box_reg: 0.07659  loss_rpn_cls: 0.000176  loss_rpn_loc: 0.01027  time: 0.4988  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:43:26 d2.utils.events]:  eta: 0:06:45  iter: 3179  total_loss: 0.1103  loss_cls: 0.008597  loss_box_reg: 0.08446  loss_rpn_cls: 0.0002418  loss_rpn_loc: 0.009267  time: 0.4989  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:43:36 d2.utils.events]:  eta: 0:06:35  iter: 3199  total_loss: 0.09692  loss_cls: 0.005105  loss_box_reg: 0.07823  loss_rpn_cls: 0.0001526  loss_rpn_loc: 0.009579  time: 0.4990  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:43:46 d2.utils.events]:  eta: 0:06:25  iter: 3219  total_loss: 0.1012  loss_cls: 0.009461  loss_box_reg: 0.08058  loss_rpn_cls: 0.000144  loss_rpn_loc: 0.0129  time: 0.4990  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:43:56 d2.utils.events]:  eta: 0:06:15  iter: 3239  total_loss: 0.09586  loss_cls: 0.00507  loss_box_reg: 0.06815  loss_rpn_cls: 0.0001654  loss_rpn_loc: 0.008524  time: 0.4989  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:44:06 d2.utils.events]:  eta: 0:06:06  iter: 3259  total_loss: 0.09425  loss_cls: 0.005779  loss_box_reg: 0.07068  loss_rpn_cls: 0.0001621  loss_rpn_loc: 0.01463  time: 0.4989  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:44:15 d2.utils.events]:  eta: 0:05:56  iter: 3279  total_loss: 0.1001  loss_cls: 0.006268  loss_box_reg: 0.0753  loss_rpn_cls: 9.968e-05  loss_rpn_loc: 0.006532  time: 0.4988  data_time: 0.0057  lr: 0.00025  max_mem: 2833M
[05/09 18:44:25 d2.utils.events]:  eta: 0:05:46  iter: 3299  total_loss: 0.09154  loss_cls: 0.00523  loss_box_reg: 0.07023  loss_rpn_cls: 0.0001431  loss_rpn_loc: 0.008727  time: 0.4988  data_time: 0.0046  lr: 0.00025  max_mem: 2833M
[05/09 18:44:35 d2.utils.events]:  eta: 0:05:36  iter: 3319  total_loss: 0.1066  loss_cls: 0.01131  loss_box_reg: 0.08283  loss_rpn_cls: 0.0001328  loss_rpn_loc: 0.01131  time: 0.4988  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:44:45 d2.utils.events]:  eta: 0:05:26  iter: 3339  total_loss: 0.09052  loss_cls: 0.007142  loss_box_reg: 0.07573  loss_rpn_cls: 0.0001455  loss_rpn_loc: 0.008821  time: 0.4988  data_time: 0.0063  lr: 0.00025  max_mem: 2833M
[05/09 18:44:55 d2.utils.events]:  eta: 0:05:16  iter: 3359  total_loss: 0.1105  loss_cls: 0.00767  loss_box_reg: 0.07923  loss_rpn_cls: 0.0001673  loss_rpn_loc: 0.01149  time: 0.4988  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:45:06 d2.utils.events]:  eta: 0:05:06  iter: 3379  total_loss: 0.1191  loss_cls: 0.00967  loss_box_reg: 0.07939  loss_rpn_cls: 0.0002081  loss_rpn_loc: 0.01234  time: 0.4989  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:45:16 d2.utils.events]:  eta: 0:04:56  iter: 3399  total_loss: 0.1007  loss_cls: 0.008559  loss_box_reg: 0.07604  loss_rpn_cls: 0.0001755  loss_rpn_loc: 0.01128  time: 0.4989  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:45:25 d2.utils.events]:  eta: 0:04:46  iter: 3419  total_loss: 0.09321  loss_cls: 0.005018  loss_box_reg: 0.07591  loss_rpn_cls: 0.000118  loss_rpn_loc: 0.006049  time: 0.4989  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:45:36 d2.utils.events]:  eta: 0:04:37  iter: 3439  total_loss: 0.09129  loss_cls: 0.006201  loss_box_reg: 0.07029  loss_rpn_cls: 0.0001467  loss_rpn_loc: 0.00907  time: 0.4989  data_time: 0.0065  lr: 0.00025  max_mem: 2833M
[05/09 18:45:46 d2.utils.events]:  eta: 0:04:27  iter: 3459  total_loss: 0.1194  loss_cls: 0.004797  loss_box_reg: 0.09137  loss_rpn_cls: 0.0001651  loss_rpn_loc: 0.01197  time: 0.4989  data_time: 0.0055  lr: 0.00025  max_mem: 2833M
[05/09 18:45:56 d2.utils.events]:  eta: 0:04:17  iter: 3479  total_loss: 0.102  loss_cls: 0.01183  loss_box_reg: 0.0712  loss_rpn_cls: 0.0002049  loss_rpn_loc: 0.01025  time: 0.4990  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:46:06 d2.utils.events]:  eta: 0:04:07  iter: 3499  total_loss: 0.1424  loss_cls: 0.01227  loss_box_reg: 0.1085  loss_rpn_cls: 0.000964  loss_rpn_loc: 0.01751  time: 0.4990  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:46:16 d2.utils.events]:  eta: 0:03:57  iter: 3519  total_loss: 0.1328  loss_cls: 0.01111  loss_box_reg: 0.09568  loss_rpn_cls: 0.000315  loss_rpn_loc: 0.01712  time: 0.4990  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:46:26 d2.utils.events]:  eta: 0:03:47  iter: 3539  total_loss: 0.0978  loss_cls: 0.008472  loss_box_reg: 0.07395  loss_rpn_cls: 0.0002692  loss_rpn_loc: 0.008814  time: 0.4990  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:46:36 d2.utils.events]:  eta: 0:03:37  iter: 3559  total_loss: 0.09801  loss_cls: 0.006644  loss_box_reg: 0.07237  loss_rpn_cls: 0.0001894  loss_rpn_loc: 0.005761  time: 0.4990  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:46:46 d2.utils.events]:  eta: 0:03:27  iter: 3579  total_loss: 0.08798  loss_cls: 0.007989  loss_box_reg: 0.06997  loss_rpn_cls: 0.0003245  loss_rpn_loc: 0.01361  time: 0.4989  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:46:56 d2.utils.events]:  eta: 0:03:17  iter: 3599  total_loss: 0.1129  loss_cls: 0.00707  loss_box_reg: 0.07812  loss_rpn_cls: 0.0002284  loss_rpn_loc: 0.01454  time: 0.4989  data_time: 0.0058  lr: 0.00025  max_mem: 2833M
[05/09 18:47:06 d2.utils.events]:  eta: 0:03:07  iter: 3619  total_loss: 0.1138  loss_cls: 0.01455  loss_box_reg: 0.07582  loss_rpn_cls: 0.0001493  loss_rpn_loc: 0.01231  time: 0.4989  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:47:16 d2.utils.events]:  eta: 0:02:58  iter: 3639  total_loss: 0.1035  loss_cls: 0.009324  loss_box_reg: 0.07187  loss_rpn_cls: 0.0001941  loss_rpn_loc: 0.01259  time: 0.4989  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:47:25 d2.utils.events]:  eta: 0:02:48  iter: 3659  total_loss: 0.1038  loss_cls: 0.00798  loss_box_reg: 0.08769  loss_rpn_cls: 0.0001796  loss_rpn_loc: 0.01212  time: 0.4988  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:47:35 d2.utils.events]:  eta: 0:02:38  iter: 3679  total_loss: 0.09202  loss_cls: 0.005134  loss_box_reg: 0.07478  loss_rpn_cls: 0.0002359  loss_rpn_loc: 0.01128  time: 0.4988  data_time: 0.0059  lr: 0.00025  max_mem: 2833M
[05/09 18:47:45 d2.utils.events]:  eta: 0:02:28  iter: 3699  total_loss: 0.09124  loss_cls: 0.005784  loss_box_reg: 0.07651  loss_rpn_cls: 0.0002195  loss_rpn_loc: 0.006858  time: 0.4987  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:47:55 d2.utils.events]:  eta: 0:02:18  iter: 3719  total_loss: 0.09003  loss_cls: 0.00704  loss_box_reg: 0.0692  loss_rpn_cls: 0.0001684  loss_rpn_loc: 0.009899  time: 0.4987  data_time: 0.0063  lr: 0.00025  max_mem: 2833M
[05/09 18:48:05 d2.utils.events]:  eta: 0:02:08  iter: 3739  total_loss: 0.09811  loss_cls: 0.005086  loss_box_reg: 0.07199  loss_rpn_cls: 0.0001965  loss_rpn_loc: 0.009242  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:48:15 d2.utils.events]:  eta: 0:01:58  iter: 3759  total_loss: 0.08991  loss_cls: 0.005057  loss_box_reg: 0.07168  loss_rpn_cls: 0.0001748  loss_rpn_loc: 0.006559  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:48:25 d2.utils.events]:  eta: 0:01:48  iter: 3779  total_loss: 0.09774  loss_cls: 0.006546  loss_box_reg: 0.08101  loss_rpn_cls: 0.000208  loss_rpn_loc: 0.008797  time: 0.4987  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:48:35 d2.utils.events]:  eta: 0:01:38  iter: 3799  total_loss: 0.09524  loss_cls: 0.01197  loss_box_reg: 0.07159  loss_rpn_cls: 0.000197  loss_rpn_loc: 0.01081  time: 0.4987  data_time: 0.0048  lr: 0.00025  max_mem: 2833M
[05/09 18:48:45 d2.utils.events]:  eta: 0:01:29  iter: 3819  total_loss: 0.09416  loss_cls: 0.01164  loss_box_reg: 0.06608  loss_rpn_cls: 0.0001841  loss_rpn_loc: 0.01157  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:48:55 d2.utils.events]:  eta: 0:01:19  iter: 3839  total_loss: 0.08306  loss_cls: 0.004777  loss_box_reg: 0.06592  loss_rpn_cls: 0.0001601  loss_rpn_loc: 0.006344  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:49:05 d2.utils.events]:  eta: 0:01:09  iter: 3859  total_loss: 0.09769  loss_cls: 0.01184  loss_box_reg: 0.0706  loss_rpn_cls: 0.0001924  loss_rpn_loc: 0.01302  time: 0.4987  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:49:15 d2.utils.events]:  eta: 0:00:59  iter: 3879  total_loss: 0.09308  loss_cls: 0.008044  loss_box_reg: 0.08306  loss_rpn_cls: 0.000141  loss_rpn_loc: 0.003762  time: 0.4987  data_time: 0.0051  lr: 0.00025  max_mem: 2833M
[05/09 18:49:25 d2.utils.events]:  eta: 0:00:49  iter: 3899  total_loss: 0.09108  loss_cls: 0.006488  loss_box_reg: 0.06548  loss_rpn_cls: 0.0001622  loss_rpn_loc: 0.01048  time: 0.4987  data_time: 0.0050  lr: 0.00025  max_mem: 2833M
[05/09 18:49:35 d2.utils.events]:  eta: 0:00:39  iter: 3919  total_loss: 0.08433  loss_cls: 0.008604  loss_box_reg: 0.06219  loss_rpn_cls: 0.0001483  loss_rpn_loc: 0.008592  time: 0.4987  data_time: 0.0054  lr: 0.00025  max_mem: 2833M
[05/09 18:49:45 d2.utils.events]:  eta: 0:00:29  iter: 3939  total_loss: 0.08707  loss_cls: 0.007597  loss_box_reg: 0.06388  loss_rpn_cls: 0.0002533  loss_rpn_loc: 0.01617  time: 0.4987  data_time: 0.0052  lr: 0.00025  max_mem: 2833M
[05/09 18:49:55 d2.utils.events]:  eta: 0:00:19  iter: 3959  total_loss: 0.09295  loss_cls: 0.006825  loss_box_reg: 0.06682  loss_rpn_cls: 0.0001669  loss_rpn_loc: 0.009611  time: 0.4987  data_time: 0.0047  lr: 0.00025  max_mem: 2833M
[05/09 18:50:05 d2.utils.events]:  eta: 0:00:09  iter: 3979  total_loss: 0.09519  loss_cls: 0.004226  loss_box_reg: 0.07542  loss_rpn_cls: 0.0001848  loss_rpn_loc: 0.01111  time: 0.4987  data_time: 0.0053  lr: 0.00025  max_mem: 2833M
[05/09 18:50:16 d2.utils.events]:  eta: 0:00:00  iter: 3999  total_loss: 0.09866  loss_cls: 0.005525  loss_box_reg: 0.08306  loss_rpn_cls: 0.0001393  loss_rpn_loc: 0.01004  time: 0.4987  data_time: 0.0049  lr: 0.00025  max_mem: 2833M
[05/09 18:50:17 d2.engine.hooks]: Overall training speed: 3998 iterations in 0:33:13 (0.4987 s / it)
[05/09 18:50:17 d2.engine.hooks]: Total training time: 0:33:18 (0:00:05 on hooks)

Let's take a look at the loss and several other metrics by running the pice of code below:

In [ ]:
%load_ext tensorboard
%tensorboard --logdir output

Testing Phase 😅

We are almost done. We trained and validated on the training data. Now its the time to predict on test set and make a submission.

Loading Pretrained Model

Before we predict on any data, let's quickly reload the model to predict on our dataset.

In [ ]:
cfg.MODEL.WEIGHTS = os.path.join(cfg.OUTPUT_DIR, "model_final.pth")  
cfg.MODEL.ROI_HEADS.SCORE_THRESH_TEST = 0.5
predictor = DefaultPredictor(cfg)

Predict Test Set

Predict on the test set and you are all set to make the submission!

In [ ]:
test_imgs_paths = os.listdir(os.path.join(data_path, "test"))

predictions = {"ImageID":[], "bboxes":[]}

for test_img_path in tqdm(test_imgs_paths):

  img = cv2.imread(os.path.join(data_path, "test", test_img_path))
  h, w, _ = img.shape

  model_predictions = predictor(img)

  bboxes = model_predictions['instances'].pred_boxes.tensor.cpu().numpy().tolist()
  scores = model_predictions['instances'].scores.cpu().numpy().tolist()

  for n, bbox in enumerate(bboxes):

      x0, y0, x1, y1 = bbox

      new_bboxes = [x0, x1, y0, y1] # Bounding boxes with correct format 
      new_bboxes.append(scores[n])

      bboxes[n] = new_bboxes

  image_id = test_img_path.split('.')[0]

  predictions['ImageID'].append(image_id)
  predictions['bboxes'].append(bboxes)

Save the prediction to csv

We have the DataFrame for our predictions are ready for submission. Now, we will make a submission to the AIcrowd platform.

You can directly upload the CSV to the challenge or use the AIcrowd CLI to make a final submission directly from this Colab Notebook.

In [ ]:
submission = pd.DataFrame(predictions)
submission
Out[ ]:
ImageID bboxes
0 2909 [[64.66773986816406, 169.8474578857422, 70.887...
1 2055 [[87.46007537841797, 132.23028564453125, 10.36...
2 3402 [[1.8363573551177979, 244.56100463867188, 57.0...
3 3780 [[44.38177490234375, 149.68325805664062, 48.47...
4 3483 [[62.2752571105957, 178.41297912597656, 110.45...
... ... ...
4995 1388 [[62.7009162902832, 228.2646484375, 63.9399681...
4996 1966 [[52.540164947509766, 142.4906463623047, 89.63...
4997 3026 [[4.417206764221191, 165.02200317382812, 42.46...
4998 4511 [[80.92703247070312, 193.214599609375, 74.2993...
4999 4688 [[52.741756439208984, 217.36578369140625, 67.2...

5000 rows × 2 columns

In [ ]:
submission.to_csv("submission.csv", index=False)

Correction of Coordinates Order

Submitting with a single line of code.

In [ ]:
# Importing Libraries
import pandas as pd
from ast import literal_eval

# Reading the submission file
submission = pd.read_csv("submission.csv")


# Going through each row
for index, row in submission.iterrows():

    # Convert the bounding boxes from string to python list
    bboxes = literal_eval(row["bboxes"])[0]

    # Convert the bounding boxes to the right format
    new_bboxes = [[bboxes[0], bboxes[2], bboxes[1], bboxes[3], bboxes[4]]]

    # Making changes to the row
    submission["bboxes"][index] = new_bboxes

# Saving the changes into a new submission file
submission.to_csv("submission.csv", index=False)
/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:19: SettingWithCopyWarning: 
A value is trying to be set on a copy of a slice from a DataFrame

See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy
In [ ]:
!aicrowd submission create -c f1-car-detection -f submission.csv
submission.csv ━━━━━━━━━━━━━━━━━━ 100.0%529.3/527.7 KB848.7 kB/s0:00:00
                                                 ╭─────────────────────────╮                                                  
                                                 │ Successfully submitted! │                                                  
                                                 ╰─────────────────────────╯                                                  
                                                       Important links                                                        
┌──────────────────┬─────────────────────────────────────────────────────────────────────────────────────────────────────────┐
│  This submission │ https://www.aicrowd.com/challenges/ai-blitz-8/problems/f1-car-detection/submissions/135966              │
│                  │                                                                                                         │
│  All submissions │ https://www.aicrowd.com/challenges/ai-blitz-8/problems/f1-car-detection/submissions?my_submissions=true │
│                  │                                                                                                         │
│      Leaderboard │ https://www.aicrowd.com/challenges/ai-blitz-8/problems/f1-car-detection/leaderboards                    │
│                  │                                                                                                         │
│ Discussion forum │ https://discourse.aicrowd.com/c/ai-blitz-8                                                              │
│                  │                                                                                                         │
│   Challenge page │ https://www.aicrowd.com/challenges/ai-blitz-8/problems/f1-car-detection                                 │
└──────────────────┴─────────────────────────────────────────────────────────────────────────────────────────────────────────┘
{'submission_id': 135966, 'created_at': '2021-05-10T17:00:16.978Z'}
In [ ]:

In [ ]:


Comments

You must login before you can post a comment.

Execute