Adhithi298 committed · Commit 926daf6 · verified · 1 Parent(s): 8fc5632

End of training

Files changed (1): README.md (+10 −7)
README.md CHANGED
@@ -16,7 +16,7 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Salesforce/codet5-small](https://huggingface.co/Salesforce/codet5-small) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.0374
+ - Loss: 0.0127
 
 ## Model description
 
@@ -41,18 +41,21 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Use OptimizerNames.ADAMW_TORCH_FUSED with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
- - num_epochs: 3
+ - num_epochs: 1
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss |
 |:-------------:|:------:|:----:|:---------------:|
- | 0.2772 | 0.5510 | 200 | 0.0929 |
- | 0.1424 | 1.1019 | 400 | 0.0616 |
- | 0.1055 | 1.6529 | 600 | 0.0458 |
- | 0.0825 | 2.2039 | 800 | 0.0401 |
- | 0.0752 | 2.7548 | 1000 | 0.0374 |
+ | 0.0866 | 0.1182 | 1000 | 0.0282 |
+ | 0.0455 | 0.2364 | 2000 | 0.0202 |
+ | 0.0338 | 0.3545 | 3000 | 0.0169 |
+ | 0.0287 | 0.4727 | 4000 | 0.0153 |
+ | 0.0241 | 0.5909 | 5000 | 0.0142 |
+ | 0.023 | 0.7091 | 6000 | 0.0138 |
+ | 0.0202 | 0.8272 | 7000 | 0.0132 |
+ | 0.0206 | 0.9454 | 8000 | 0.0127 |
 
 
 ### Framework versions
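The hyperparameters listed in the diff map onto a Hugging Face `Trainer` configuration. A minimal sketch of that setup, assuming the `transformers` library; the `output_dir`, learning rate, and batch size are placeholders (they are not shown in this diff), and only the values visible in the card (seed, fused AdamW with its betas/epsilon, linear schedule, 1 epoch, native AMP) come from the source:

```python
# Hypothetical training-arguments sketch reconstructed from the model card.
# Values NOT listed in the card (output_dir, learning_rate, batch size)
# are placeholders, not the author's actual settings.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="codet5-small-finetuned",  # placeholder path
    seed=42,
    optim="adamw_torch_fused",  # OptimizerNames.ADAMW_TORCH_FUSED
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=1,          # updated from 3 in this commit
    fp16=True,                   # "Native AMP" mixed-precision training
)
```

These arguments would then be passed to a `Seq2SeqTrainer` together with the `Salesforce/codet5-small` checkpoint and the (unnamed) dataset.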