sophia-m committed f0a017e (verified; 1 parent: 38e3f12)

Update README.md

Files changed (1): README.md (+3 −1)

README.md CHANGED
@@ -53,14 +53,16 @@ I also utilized Ultralytics’ heatmap code to display a heatmap on the input vi
## Confusion Matrix
The final confusion matrix shows that my model was very successful at identifying sea otters across the dataset. There are some mislabels where the model mistook a sea otter for background, but this is to be expected given the quality of the training images and the smaller dataset.
https://huggingface.co/OceanCV/Southern_Sea_Otter_Tracking/blob/main/confusion_matrix_final.png
![Confusion matrix from final run](https://huggingface.co/OceanCV/Southern_Sea_Otter_Tracking/resolve/main/confusion_matrix_final.png?download=true)
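The otter-mistaken-for-background errors described above are false negatives, which show up in the true-otter row of the matrix and lower recall. A minimal sketch of reading those cells, using made-up counts (not this model's actual results) in the common (class, background) layout for a one-class detector:

```python
# Illustrative only: hypothetical counts, not the model's actual matrix.
# Rows = true class, columns = predicted class: (sea otter, background).
confusion = [
    [180, 20],  # true sea otter: 180 detected, 20 missed as background
    [5, 0],     # true background: 5 false otter detections
]

otter_tp = confusion[0][0]
otter_fn = confusion[0][1]  # otters the model mistook for background
otter_fp = confusion[1][0]  # background the model mistook for an otter

recall = otter_tp / (otter_tp + otter_fn)
precision = otter_tp / (otter_tp + otter_fp)
print(f"recall={recall:.3f} precision={precision:.3f}")
```

With these example counts, the 20 missed otters cost recall (0.900) far more than the 5 false detections cost precision (0.973), matching the failure mode described above.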

## F1 Score
The final F1 curve shows my model’s high precision and recall across the various confidence levels. The curve had a high peak, signifying a strong balance between precision and recall (F1 is the harmonic mean of the two).
https://huggingface.co/OceanCV/Southern_Sea_Otter_Tracking/blob/main/F1_curve.png
![F1 graph from final run](https://huggingface.co/OceanCV/Southern_Sea_Otter_Tracking/resolve/main/F1_curve.png?download=true)
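The F1 curve plots the harmonic mean of precision and recall at each confidence threshold, and its peak suggests a good operating threshold. A minimal sketch with hypothetical (precision, recall) pairs, not values from this model's actual curve:

```python
def f1(precision, recall):
    """Harmonic mean of precision and recall."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical (precision, recall) pairs at increasing confidence thresholds.
curve = {0.25: (0.80, 0.95), 0.50: (0.92, 0.90), 0.75: (0.97, 0.70)}
scores = {conf: f1(p, r) for conf, (p, r) in curve.items()}
best_conf = max(scores, key=scores.get)  # threshold at the curve's peak
```

Because the harmonic mean punishes imbalance, the middle threshold wins here: it sacrifices a little precision and recall at the extremes for the best joint score.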

## Object Detection Model Output
My final object detection output video was a key metric in assessing the performance of my model. I alternated between reviewing the output video, checking how accurate the bounding boxes and identifications were, and rerunning the model with modified parameters. My final model output was successful at identifying sea otters both on land and in water, with minimal misclassifications or missed detections.
![Final object detection video from object detection final run](https://huggingface.co/OceanCV/Southern_Sea_Otter_Tracking/resolve/main/object_detection_final.avi?download=true)
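One of the parameters typically tuned in the review-and-rerun loop described above is the confidence threshold, which trades missed detections against false positives. A minimal sketch of that thresholding step, using made-up detections (each a hypothetical label/confidence pair, not actual model output):

```python
# Illustrative only: hypothetical detections from one video frame.
detections = [
    ("sea_otter", 0.91),
    ("sea_otter", 0.34),  # low confidence: likely a false positive
    ("sea_otter", 0.67),
]

def keep(dets, threshold):
    """Keep only detections at or above the confidence threshold."""
    return [d for d in dets if d[1] >= threshold]

print(len(keep(detections, 0.25)))  # all three boxes survive
print(len(keep(detections, 0.50)))  # the low-confidence box is dropped
```

Raising the threshold suppresses spurious boxes in the output video but risks missing real otters, which is why inspecting the rendered video between runs is a useful check.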

---
# Model Use-case