line1-classifier-nt-encoding-linear

This model is a fine-tuned version of InstaDeepAI/nucleotide-transformer-500m-human-ref on an unspecified dataset. It achieves the following results on the evaluation set (the arithmetic behind these numbers is sketched after the list):

  • Loss: 0.3788
  • F1 Score: 0.8515
  • Precision Score: 0.8354
  • Recall Score: 0.8747
  • TP: 335
  • TN: 319
  • FP: 66
  • FN: 48
  • Line Ratio Reference: 0.4987
  • Line Ratio Predictions: 0.5221
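
These headline numbers are mutually consistent and can be re-derived from the confusion counts. A minimal Python sketch follows; note that reading "Line Ratio" as the positive fraction of labels/predictions, and the reported F1 as a macro average over both classes, are inferences from the numbers rather than documentation on this card:

```python
# Re-derive the headline metrics from the confusion counts above.
tp, tn, fp, fn = 335, 319, 66, 48
total = tp + tn + fp + fn                                # 768 eval examples

precision = tp / (tp + fp)                               # 0.8354
recall = tp / (tp + fn)                                  # 0.8747
f1_pos = 2 * precision * recall / (precision + recall)   # 0.8546 (positive class)

# The reported F1 (0.8515) matches the macro average over both classes.
precision_neg = tn / (tn + fn)
recall_neg = tn / (tn + fp)
f1_neg = 2 * precision_neg * recall_neg / (precision_neg + recall_neg)
f1_macro = (f1_pos + f1_neg) / 2                         # 0.8515

# "Line Ratio" is consistent with the positive fraction of labels/predictions.
line_ratio_reference = (tp + fn) / total                 # 0.4987
line_ratio_predictions = (tp + fp) / total               # 0.5221
```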

Model description

More information needed

Intended uses & limitations

More information needed
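
While usage details are not documented, the adapter can be loaded with the framework versions listed below. A minimal sketch, assuming the adapter is a PEFT sequence-classification head with two labels on the base model; the head type and label semantics are assumptions, not documented here:

```python
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForSequenceClassification

# Assumption: the adapter wraps a two-label sequence-classification head;
# neither the head type nor the label meanings are documented on this card.
model = AutoPeftModelForSequenceClassification.from_pretrained(
    "Kpps/line1-classifier-nt-encoding-linear", num_labels=2
)
model.eval()

# The tokenizer comes from the base model the adapter was trained on.
tokenizer = AutoTokenizer.from_pretrained(
    "InstaDeepAI/nucleotide-transformer-500m-human-ref"
)

sequence = "ATTGCCGATTCCGAGTCCGA"  # placeholder DNA sequence
inputs = tokenizer(sequence, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # class probabilities
```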

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (mirrored as a TrainingArguments sketch after the list):

  • learning_rate: 0.0001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: adamw_torch_fused (fused PyTorch AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 3
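
These map directly onto Transformers TrainingArguments. A minimal sketch, assuming the standard Trainer API; output_dir is a placeholder, and the 100-step evaluation cadence is read off the results table below:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
args = TrainingArguments(
    output_dir="line1-classifier-nt-encoding-linear",  # placeholder path
    learning_rate=1e-4,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch_fused",   # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=3,
    eval_strategy="steps",
    eval_steps=100,
)
```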

Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision Score | Recall Score | TP | TN | FP | FN | Line Ratio Reference | Line Ratio Predictions |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 0.6003 | 0.0406 | 100 | 0.5406 | 0.7261 | 0.7084 | 0.7676 | 294 | 264 | 121 | 89 | 0.4987 | 0.5404 |
| 0.5673 | 0.0812 | 200 | 0.6857 | 0.6227 | 0.8839 | 0.3577 | 137 | 367 | 18 | 246 | 0.4987 | 0.2018 |
| 0.545 | 0.1218 | 300 | 0.5341 | 0.7457 | 0.8556 | 0.6031 | 231 | 346 | 39 | 152 | 0.4987 | 0.3516 |
| 0.4891 | 0.1623 | 400 | 0.4730 | 0.7744 | 0.7949 | 0.7389 | 283 | 312 | 73 | 100 | 0.4987 | 0.4635 |
| 0.4933 | 0.2029 | 500 | 0.4979 | 0.7539 | 0.7048 | 0.8851 | 339 | 243 | 142 | 44 | 0.4987 | 0.6263 |
| 0.4762 | 0.2435 | 600 | 0.4560 | 0.7749 | 0.8487 | 0.6736 | 258 | 339 | 46 | 125 | 0.4987 | 0.3958 |
| 0.4571 | 0.2841 | 700 | 0.4380 | 0.8007 | 0.8108 | 0.7833 | 300 | 315 | 70 | 83 | 0.4987 | 0.4818 |
| 0.44 | 0.3247 | 800 | 0.4292 | 0.8007 | 0.7919 | 0.8146 | 312 | 303 | 82 | 71 | 0.4987 | 0.5130 |
| 0.4676 | 0.3653 | 900 | 0.4455 | 0.7907 | 0.8684 | 0.6893 | 264 | 345 | 40 | 119 | 0.4987 | 0.3958 |
| 0.4327 | 0.4058 | 1000 | 0.4089 | 0.8081 | 0.8410 | 0.7598 | 291 | 330 | 55 | 92 | 0.4987 | 0.4505 |
| 0.4356 | 0.4464 | 1100 | 0.4444 | 0.7848 | 0.7428 | 0.8747 | 335 | 269 | 116 | 48 | 0.4987 | 0.5872 |
| 0.4432 | 0.4870 | 1200 | 0.4180 | 0.8059 | 0.7940 | 0.8251 | 316 | 303 | 82 | 67 | 0.4987 | 0.5182 |
| 0.4272 | 0.5276 | 1300 | 0.3972 | 0.8149 | 0.8375 | 0.7807 | 299 | 327 | 58 | 84 | 0.4987 | 0.4648 |
| 0.4174 | 0.5682 | 1400 | 0.4586 | 0.7955 | 0.9134 | 0.6606 | 253 | 361 | 24 | 130 | 0.4987 | 0.3607 |
| 0.4161 | 0.6088 | 1500 | 0.3968 | 0.8110 | 0.8324 | 0.7781 | 298 | 325 | 60 | 85 | 0.4987 | 0.4661 |
| 0.4282 | 0.6494 | 1600 | 0.3943 | 0.8268 | 0.8157 | 0.8433 | 323 | 312 | 73 | 60 | 0.4987 | 0.5156 |
| 0.4256 | 0.6899 | 1700 | 0.3960 | 0.8136 | 0.8352 | 0.7807 | 299 | 326 | 59 | 84 | 0.4987 | 0.4661 |
| 0.4018 | 0.7305 | 1800 | 0.3926 | 0.8307 | 0.8373 | 0.8198 | 314 | 324 | 61 | 69 | 0.4987 | 0.4883 |
| 0.4195 | 0.7711 | 1900 | 0.3920 | 0.8294 | 0.8247 | 0.8355 | 320 | 317 | 68 | 63 | 0.4987 | 0.5052 |
| 0.399 | 0.8117 | 2000 | 0.4236 | 0.8221 | 0.7839 | 0.8903 | 341 | 291 | 94 | 42 | 0.4987 | 0.5664 |
| 0.4006 | 0.8523 | 2100 | 0.3959 | 0.8227 | 0.8020 | 0.8564 | 328 | 304 | 81 | 55 | 0.4987 | 0.5326 |
| 0.3822 | 0.8929 | 2200 | 0.3932 | 0.8249 | 0.8673 | 0.7676 | 294 | 340 | 45 | 89 | 0.4987 | 0.4414 |
| 0.3659 | 0.9334 | 2300 | 0.3931 | 0.8203 | 0.8868 | 0.7363 | 282 | 349 | 36 | 101 | 0.4987 | 0.4141 |
| 0.3416 | 0.9740 | 2400 | 0.4383 | 0.8233 | 0.7831 | 0.8956 | 343 | 290 | 95 | 40 | 0.4987 | 0.5703 |
| 0.3896 | 1.0146 | 2500 | 0.3925 | 0.8358 | 0.8579 | 0.8042 | 308 | 334 | 51 | 75 | 0.4987 | 0.4674 |
| 0.3375 | 1.0552 | 2600 | 0.3863 | 0.8259 | 0.8811 | 0.7546 | 289 | 346 | 39 | 94 | 0.4987 | 0.4271 |
| 0.3572 | 1.0958 | 2700 | 0.3672 | 0.8385 | 0.8381 | 0.8381 | 321 | 323 | 62 | 62 | 0.4987 | 0.4987 |
| 0.3435 | 1.1364 | 2800 | 0.3661 | 0.8451 | 0.8455 | 0.8433 | 323 | 326 | 59 | 60 | 0.4987 | 0.4974 |
| 0.3431 | 1.1769 | 2900 | 0.3935 | 0.8206 | 0.8773 | 0.7467 | 286 | 345 | 40 | 97 | 0.4987 | 0.4245 |
| 0.3397 | 1.2175 | 3000 | 0.3727 | 0.8302 | 0.8688 | 0.7781 | 298 | 340 | 45 | 85 | 0.4987 | 0.4466 |
| 0.3304 | 1.2581 | 3100 | 0.3882 | 0.8261 | 0.8720 | 0.7650 | 293 | 342 | 43 | 90 | 0.4987 | 0.4375 |
| 0.3369 | 1.2987 | 3200 | 0.3750 | 0.8450 | 0.8333 | 0.8616 | 330 | 319 | 66 | 53 | 0.4987 | 0.5156 |
| 0.3379 | 1.3393 | 3300 | 0.3646 | 0.8463 | 0.8571 | 0.8303 | 318 | 332 | 53 | 65 | 0.4987 | 0.4831 |
| 0.3359 | 1.3799 | 3400 | 0.3653 | 0.8357 | 0.8599 | 0.8016 | 307 | 335 | 50 | 76 | 0.4987 | 0.4648 |
| 0.3339 | 1.4205 | 3500 | 0.3798 | 0.8355 | 0.8746 | 0.7833 | 300 | 342 | 43 | 83 | 0.4987 | 0.4466 |
| 0.3441 | 1.4610 | 3600 | 0.3712 | 0.8331 | 0.8571 | 0.7990 | 306 | 334 | 51 | 77 | 0.4987 | 0.4648 |
| 0.3288 | 1.5016 | 3700 | 0.3672 | 0.8501 | 0.8743 | 0.8172 | 313 | 340 | 45 | 70 | 0.4987 | 0.4661 |
| 0.3322 | 1.5422 | 3800 | 0.3575 | 0.8382 | 0.8711 | 0.7937 | 304 | 340 | 45 | 79 | 0.4987 | 0.4544 |
| 0.334 | 1.5828 | 3900 | 0.3691 | 0.8344 | 0.8122 | 0.8695 | 333 | 308 | 77 | 50 | 0.4987 | 0.5339 |
| 0.3215 | 1.6234 | 4000 | 0.3674 | 0.8358 | 0.8173 | 0.8642 | 331 | 311 | 74 | 52 | 0.4987 | 0.5273 |
| 0.3187 | 1.6640 | 4100 | 0.3766 | 0.8326 | 0.8829 | 0.7676 | 294 | 346 | 39 | 89 | 0.4987 | 0.4336 |
| 0.3123 | 1.7045 | 4200 | 0.3965 | 0.8404 | 0.8919 | 0.7755 | 297 | 349 | 36 | 86 | 0.4987 | 0.4336 |
| 0.3135 | 1.7451 | 4300 | 0.3708 | 0.8486 | 0.8847 | 0.8016 | 307 | 345 | 40 | 76 | 0.4987 | 0.4518 |
| 0.3344 | 1.7857 | 4400 | 0.3954 | 0.8367 | 0.8 | 0.8982 | 344 | 299 | 86 | 39 | 0.4987 | 0.5599 |
| 0.3208 | 1.8263 | 4500 | 0.3645 | 0.8380 | 0.8798 | 0.7833 | 300 | 344 | 41 | 83 | 0.4987 | 0.4440 |
| 0.3234 | 1.8669 | 4600 | 0.3685 | 0.8445 | 0.8905 | 0.7859 | 301 | 348 | 37 | 82 | 0.4987 | 0.4401 |
| 0.326 | 1.9075 | 4700 | 0.3506 | 0.8463 | 0.8571 | 0.8303 | 318 | 332 | 53 | 65 | 0.4987 | 0.4831 |
| 0.3194 | 1.9481 | 4800 | 0.3582 | 0.8423 | 0.8195 | 0.8773 | 336 | 311 | 74 | 47 | 0.4987 | 0.5339 |
| 0.296 | 1.9886 | 4900 | 0.3560 | 0.8503 | 0.8490 | 0.8512 | 326 | 327 | 58 | 57 | 0.4987 | 0.5 |
| 0.2809 | 2.0292 | 5000 | 0.3788 | 0.8515 | 0.8354 | 0.8747 | 335 | 319 | 66 | 48 | 0.4987 | 0.5221 |
| 0.272 | 2.0698 | 5100 | 0.3511 | 0.8528 | 0.8689 | 0.8303 | 318 | 337 | 48 | 65 | 0.4987 | 0.4766 |
| 0.2556 | 2.1104 | 5200 | 0.3718 | 0.8463 | 0.8630 | 0.8225 | 315 | 335 | 50 | 68 | 0.4987 | 0.4753 |
| 0.2358 | 2.1510 | 5300 | 0.3761 | 0.8514 | 0.8747 | 0.8198 | 314 | 340 | 45 | 69 | 0.4987 | 0.4674 |
| 0.2738 | 2.1916 | 5400 | 0.3681 | 0.8539 | 0.8839 | 0.8146 | 312 | 344 | 41 | 71 | 0.4987 | 0.4596 |
| 0.2538 | 2.2321 | 5500 | 0.3959 | 0.8449 | 0.8220 | 0.8799 | 337 | 312 | 73 | 46 | 0.4987 | 0.5339 |
| 0.2602 | 2.2727 | 5600 | 0.4058 | 0.8487 | 0.8232 | 0.8877 | 340 | 312 | 73 | 43 | 0.4987 | 0.5378 |
| 0.2768 | 2.3133 | 5700 | 0.3697 | 0.8472 | 0.8866 | 0.7963 | 305 | 346 | 39 | 78 | 0.4987 | 0.4479 |
| 0.2637 | 2.3539 | 5800 | 0.3622 | 0.8529 | 0.8497 | 0.8564 | 328 | 327 | 58 | 55 | 0.4987 | 0.5026 |
| 0.252 | 2.3945 | 5900 | 0.3607 | 0.8580 | 0.8703 | 0.8407 | 322 | 337 | 48 | 61 | 0.4987 | 0.4818 |
| 0.2573 | 2.4351 | 6000 | 0.3544 | 0.8541 | 0.8613 | 0.8433 | 323 | 333 | 52 | 60 | 0.4987 | 0.4883 |
| 0.2736 | 2.4756 | 6100 | 0.3495 | 0.8515 | 0.8645 | 0.8329 | 319 | 335 | 50 | 64 | 0.4987 | 0.4805 |
| 0.2563 | 2.5162 | 6200 | 0.3536 | 0.8541 | 0.8672 | 0.8355 | 320 | 336 | 49 | 63 | 0.4987 | 0.4805 |
| 0.2307 | 2.5568 | 6300 | 0.3612 | 0.8489 | 0.8638 | 0.8277 | 317 | 335 | 50 | 66 | 0.4987 | 0.4779 |
| 0.2499 | 2.5974 | 6400 | 0.3586 | 0.8527 | 0.875 | 0.8225 | 315 | 340 | 45 | 68 | 0.4987 | 0.4688 |
| 0.2552 | 2.6380 | 6500 | 0.3558 | 0.8541 | 0.8733 | 0.8277 | 317 | 339 | 46 | 66 | 0.4987 | 0.4727 |
| 0.2503 | 2.6786 | 6600 | 0.3505 | 0.8528 | 0.8709 | 0.8277 | 317 | 338 | 47 | 66 | 0.4987 | 0.4740 |
| 0.2646 | 2.7192 | 6700 | 0.3493 | 0.8592 | 0.8809 | 0.8303 | 318 | 342 | 43 | 65 | 0.4987 | 0.4701 |
| 0.2485 | 2.7597 | 6800 | 0.3523 | 0.8554 | 0.8676 | 0.8381 | 321 | 336 | 49 | 62 | 0.4987 | 0.4818 |
| 0.2556 | 2.8003 | 6900 | 0.3612 | 0.8501 | 0.8743 | 0.8172 | 313 | 340 | 45 | 70 | 0.4987 | 0.4661 |
| 0.2297 | 2.8409 | 7000 | 0.3624 | 0.8567 | 0.8740 | 0.8329 | 319 | 339 | 46 | 64 | 0.4987 | 0.4753 |
| 0.2595 | 2.8815 | 7100 | 0.3608 | 0.8528 | 0.8649 | 0.8355 | 320 | 335 | 50 | 63 | 0.4987 | 0.4818 |
| 0.2523 | 2.9221 | 7200 | 0.3587 | 0.8554 | 0.8716 | 0.8329 | 319 | 338 | 47 | 64 | 0.4987 | 0.4766 |
| 0.2647 | 2.9627 | 7300 | 0.3569 | 0.8541 | 0.8652 | 0.8381 | 321 | 335 | 50 | 62 | 0.4987 | 0.4831 |

Framework versions

  • PEFT 0.18.0
  • Transformers 4.57.3
  • Pytorch 2.9.0+cu126
  • Datasets 4.0.0
  • Tokenizers 0.22.1