[2025-06-30 08:05:48,345][__main__][INFO] - cache_dir: /tmp/
dataset:
  name: kamel-usp/aes_enem_dataset
  split: JBCS2025
training_params:
  seed: 42
  num_train_epochs: 20
  logging_steps: 100
  metric_for_best_model: QWK
  bf16: true
bootstrap:
  enabled: true
  n_bootstrap: 10000
  bootstrap_seed: 42
  metrics:
  - QWK
  - Macro_F1
  - Weighted_F1
post_training_results:
  model_path: /workspace/jbcs2025/outputs/2025-03-24/20-42-59
experiments:
  model:
    name: microsoft/phi-4
    type: phi4_classification_lora
    num_labels: 6
    output_dir: ./results/
    logging_dir: ./logs/
    best_model_dir: ./results/best_model
    lora_r: 8
    lora_dropout: 0.05
    lora_alpha: 16
    lora_target_modules: all-linear
    checkpoint_path: ''
  tokenizer:
    name: microsoft/phi-4
  dataset:
    grade_index: 1
    use_full_context: true
  training_params:
    weight_decay: 0.01
    warmup_ratio: 0.1
    learning_rate: 5.0e-05
    train_batch_size: 1
    eval_batch_size: 16
    gradient_accumulation_steps: 16
    gradient_checkpointing: false
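For orientation, here is a minimal sketch of how a config like the one above could be mapped onto Hugging Face `TrainingArguments`. The helper is hypothetical, and the per-epoch evaluate/save strategy is an assumption inferred from the 32-step checkpoint spacing later in this log:

    # Hypothetical glue code, assuming the YAML above is loaded into `cfg`
    # (e.g. via OmegaConf); field names mirror the logged config.
    from transformers import TrainingArguments

    def build_training_args(cfg):
        tp = cfg.experiments.training_params
        return TrainingArguments(
            output_dir=cfg.experiments.model.output_dir,
            logging_dir=cfg.experiments.model.logging_dir,
            seed=cfg.training_params.seed,
            num_train_epochs=cfg.training_params.num_train_epochs,
            logging_steps=cfg.training_params.logging_steps,
            metric_for_best_model=cfg.training_params.metric_for_best_model,
            greater_is_better=True,
            load_best_model_at_end=True,
            bf16=cfg.training_params.bf16,
            learning_rate=tp.learning_rate,
            weight_decay=tp.weight_decay,
            warmup_ratio=tp.warmup_ratio,
            per_device_train_batch_size=tp.train_batch_size,
            per_device_eval_batch_size=tp.eval_batch_size,
            gradient_accumulation_steps=tp.gradient_accumulation_steps,
            gradient_checkpointing=tp.gradient_checkpointing,
            eval_strategy="epoch",  # assumption: evaluations recur every 32 steps below
            save_strategy="epoch",
        )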
[2025-06-30 08:05:52,309][__main__][INFO] - GPU 0: NVIDIA H200 | TDP ≈ 700 W
[2025-06-30 08:05:52,310][__main__][INFO] - Starting the Fine Tuning training process.
[2025-06-30 08:05:58,177][transformers.tokenization_utils_base][INFO] - loading file vocab.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/vocab.json
[2025-06-30 08:05:58,177][transformers.tokenization_utils_base][INFO] - loading file merges.txt from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/merges.txt
[2025-06-30 08:05:58,178][transformers.tokenization_utils_base][INFO] - loading file tokenizer.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/tokenizer.json
[2025-06-30 08:05:58,178][transformers.tokenization_utils_base][INFO] - loading file added_tokens.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/added_tokens.json
[2025-06-30 08:05:58,178][transformers.tokenization_utils_base][INFO] - loading file special_tokens_map.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/special_tokens_map.json
[2025-06-30 08:05:58,178][transformers.tokenization_utils_base][INFO] - loading file tokenizer_config.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/tokenizer_config.json
[2025-06-30 08:05:58,178][transformers.tokenization_utils_base][INFO] - loading file chat_template.jinja from cache at None
[2025-06-30 08:05:58,320][__main__][INFO] - Tokenizer function parameters - Padding: longest; Truncation: False; Use Full Context: True
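The "longest"/no-truncation settings mean each batch is padded to its longest sequence and nothing is cut off. A rough stand-in for the preprocessing step (the concatenation order of supporting text and essay is an assumption; the column names are those listed in the Trainer messages below):

    # Sketch only; the project's actual tokenize function is not shown in this log.
    from transformers import AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-4", cache_dir="/tmp/")

    def tokenize_batch(batch):
        # use_full_context=True: assumed to prepend the supporting text to the essay
        texts = [f"{sup}\n\n{essay}" for sup, essay in
                 zip(batch["supporting_text"], batch["essay_text"])]
        return tokenizer(texts, padding="longest", truncation=False)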
[2025-06-30 08:06:00,139][transformers.configuration_utils][INFO] - loading configuration file config.json from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/config.json
[2025-06-30 08:06:00,140][transformers.configuration_utils][INFO] - Model config Phi3Config {
  "architectures": [
    "Phi3ForCausalLM"
  ],
  "attention_bias": false,
  "attention_dropout": 0.0,
  "bos_token_id": 100257,
  "embd_pdrop": 0.0,
  "eos_token_id": 100265,
  "hidden_act": "silu",
  "hidden_size": 5120,
  "id2label": {
    "0": "LABEL_0",
    "1": "LABEL_1",
    "2": "LABEL_2",
    "3": "LABEL_3",
    "4": "LABEL_4",
    "5": "LABEL_5"
  },
  "initializer_range": 0.02,
  "intermediate_size": 17920,
  "label2id": {
    "LABEL_0": 0,
    "LABEL_1": 1,
    "LABEL_2": 2,
    "LABEL_3": 3,
    "LABEL_4": 4,
    "LABEL_5": 5
  },
  "max_position_embeddings": 16384,
  "model_type": "phi3",
  "num_attention_heads": 40,
  "num_hidden_layers": 40,
  "num_key_value_heads": 10,
  "original_max_position_embeddings": 16384,
  "pad_token_id": 100349,
  "partial_rotary_factor": 1.0,
  "resid_pdrop": 0.0,
  "rms_norm_eps": 1e-05,
  "rope_scaling": null,
  "rope_theta": 250000,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.53.0",
  "use_cache": true,
  "vocab_size": 100352
}
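Loading the checkpoint as a six-way classifier, as the next lines do, might look like the sketch below (not the project's code; num_labels=6 matches the id2label map above, and the dtype comes from the checkpoint's bfloat16 config):

    import torch
    from transformers import AutoModelForSequenceClassification

    model = AutoModelForSequenceClassification.from_pretrained(
        "microsoft/phi-4",
        num_labels=6,
        torch_dtype=torch.bfloat16,
        cache_dir="/tmp/",
    )
    # The causal LM head (lm_head.weight) is dropped and a fresh 6 x 5120
    # classification head (score.weight) is initialized; that is exactly
    # what the warnings below report.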
[2025-06-30 08:06:00,277][transformers.modeling_utils][INFO] - loading weights file model.safetensors from cache at /tmp/models--microsoft--phi-4/snapshots/187ef0342fff0eb3333be9f00389385e95ef0b61/model.safetensors.index.json
[2025-06-30 08:06:00,278][transformers.modeling_utils][INFO] - Will use torch_dtype=torch.bfloat16 as defined in model's config object
[2025-06-30 08:06:00,278][transformers.modeling_utils][INFO] - Instantiating Phi3ForSequenceClassification model under default dtype torch.bfloat16.
[2025-06-30 08:06:08,653][transformers.modeling_utils][INFO] - Some weights of the model checkpoint at microsoft/phi-4 were not used when initializing Phi3ForSequenceClassification: ['lm_head.weight']
- This IS expected if you are initializing Phi3ForSequenceClassification from the checkpoint of a model trained on another task or with another architecture (e.g. initializing a BertForSequenceClassification model from a BertForPreTraining model).
- This IS NOT expected if you are initializing Phi3ForSequenceClassification from the checkpoint of a model that you expect to be exactly identical (initializing a BertForSequenceClassification model from a BertForSequenceClassification model).
[2025-06-30 08:06:08,653][transformers.modeling_utils][WARNING] - Some weights of Phi3ForSequenceClassification were not initialized from the model checkpoint at microsoft/phi-4 and are newly initialized: ['score.weight']
You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference.
[2025-06-30 08:06:10,413][__main__][INFO] - Initialized new PEFT model for ce loss
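A PEFT setup consistent with the logged hyperparameters (lora_r=8, lora_alpha=16, lora_dropout=0.05, lora_target_modules=all-linear) would look roughly like this; the project's actual wrapper may differ:

    from peft import LoraConfig, TaskType, get_peft_model

    lora_cfg = LoraConfig(
        task_type=TaskType.SEQ_CLS,   # classifier trained with plain cross-entropy ("ce loss")
        r=8,
        lora_alpha=16,
        lora_dropout=0.05,
        target_modules="all-linear",  # adapters on every linear projection
    )
    model = get_peft_model(model, lora_cfg)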
[2025-06-30 08:06:10,415][__main__][INFO] - None
[2025-06-30 08:06:10,416][transformers.training_args][INFO] - PyTorch: setting up devices
[2025-06-30 08:06:10,452][__main__][INFO] - Total steps: 620. Number of warmup steps: 62
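The 620 logged here and the 640 the Trainer reports below most likely differ only in rounding: with 500 training examples and an effective batch of 1 × 16 = 16, flooring the steps per epoch gives 31 × 20 = 620, while the Trainer's ceiling gives 32 × 20 = 640; warmup is 0.1 × 620 = 62.

    # Reproducing both step counts (rounding is the assumed cause of the gap):
    import math
    examples, effective_batch, epochs, warmup_ratio = 500, 1 * 16, 20, 0.1
    print(math.floor(examples / effective_batch) * epochs)  # 620 (script's estimate)
    print(math.ceil(examples / effective_batch) * epochs)   # 640 (Trainer's count)
    print(int(warmup_ratio * 620))                          # 62 warmup steps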
[2025-06-30 08:06:10,457][transformers.trainer][INFO] - You have loaded a model on multiple GPUs. `is_model_parallel` attribute will be force-set to `True` to avoid any unexpected behavior such as device placement mismatching.
[2025-06-30 08:06:10,481][transformers.trainer][INFO] - Using auto half precision backend
[2025-06-30 08:06:10,482][transformers.trainer][WARNING] - No label_names provided for model class `PeftModelForSequenceClassification`. Since `PeftModel` hides base models input arguments, if label_names is not given, label_names can't be set automatically within `Trainer`. Note that empty label_names list will be used instead.
[2025-06-30 08:06:10,483][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:06:10,496][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:06:10,496][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:06:10,496][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:06:55,940][transformers.trainer][INFO] - The following columns in the Training set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - ***** Running training *****
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Num examples = 500
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Num Epochs = 20
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Instantaneous batch size per device = 1
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Total train batch size (w. parallel, distributed & accumulation) = 16
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Gradient Accumulation steps = 16
[2025-06-30 08:06:55,981][transformers.trainer][INFO] - Total optimization steps = 640
[2025-06-30 08:06:55,983][transformers.trainer][INFO] - Number of trainable parameters = 27,883,520
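The trainable-parameter count can be reproduced by hand for r=8 adapters on each of the model's fused linear projections plus the newly initialized classification head (shapes taken from the Phi3Config above):

    # Hand check of 27,883,520: LoRA adds r * (fan_in + fan_out) parameters
    # per adapted linear layer.
    r, hidden, inter, layers = 8, 5120, 17920, 40
    head_dim, kv_heads = 5120 // 40, 10                    # 128-dim heads, 10 KV heads
    qkv = r * (hidden + hidden + 2 * kv_heads * head_dim)  # fused q/k/v projection
    o = r * (hidden + hidden)                              # output projection
    gate_up = r * (hidden + 2 * inter)                     # fused gate/up projection
    down = r * (inter + hidden)                            # down projection
    score = 6 * hidden                                     # new classification head
    print(layers * (qkv + o + gate_up + down) + score)     # 27883520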
[2025-06-30 08:14:02,346][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:14:02,349][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:14:02,349][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:14:02,349][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:14:47,567][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-32
[2025-06-30 08:21:55,401][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:21:55,405][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:21:55,405][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:21:55,405][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:22:40,630][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-64
[2025-06-30 08:29:48,445][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:29:48,448][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:29:48,448][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:29:48,448][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:30:33,554][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-96
[2025-06-30 08:30:34,540][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-32] due to args.save_total_limit
[2025-06-30 08:30:34,558][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-64] due to args.save_total_limit
[2025-06-30 08:37:41,234][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:37:41,237][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:37:41,237][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:37:41,237][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:38:26,445][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-128
[2025-06-30 08:38:27,547][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-96] due to args.save_total_limit
[2025-06-30 08:45:34,280][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:45:34,284][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:45:34,284][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:45:34,284][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:46:19,387][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-160
[2025-06-30 08:46:20,555][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-128] due to args.save_total_limit
[2025-06-30 08:53:27,914][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 08:53:27,917][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 08:53:27,917][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 08:53:27,917][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 08:54:13,046][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-192
[2025-06-30 09:01:21,666][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:01:21,670][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:01:21,670][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 09:01:21,670][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:02:06,827][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-224
[2025-06-30 09:02:07,852][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-192] due to args.save_total_limit
[2025-06-30 09:09:14,419][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:09:14,423][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:09:14,423][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 09:09:14,423][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:09:59,575][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-256
[2025-06-30 09:10:00,654][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-224] due to args.save_total_limit
[2025-06-30 09:17:07,402][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:17:07,405][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:17:07,406][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 09:17:07,406][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:17:52,535][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-288
[2025-06-30 09:17:53,663][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-256] due to args.save_total_limit
[2025-06-30 09:25:00,179][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:25:00,182][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:25:00,183][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 09:25:00,183][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:25:45,339][transformers.trainer][INFO] - Saving model checkpoint to /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-320
[2025-06-30 09:25:46,565][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-288] due to args.save_total_limit
[2025-06-30 09:25:46,592][transformers.trainer][INFO] -

Training completed. Do not forget to share your model on huggingface.co/models =)

[2025-06-30 09:25:46,592][transformers.trainer][INFO] - Loading best model from /workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-160 (score: 0.5310788282924506).
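The selection score 0.5311 is the validation QWK (metric_for_best_model above), so checkpoint-160 had the best quadratic weighted kappa on the 132-essay validation split. A minimal version of the metric, assuming the standard scikit-learn implementation rather than the project's own:

    # Quadratic Weighted Kappa: chance-corrected agreement between predicted
    # and true grades, penalizing large disagreements quadratically.
    from sklearn.metrics import cohen_kappa_score

    def qwk(y_true, y_pred):
        return cohen_kappa_score(y_true, y_pred, weights="quadratic")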
[2025-06-30 09:25:46,784][transformers.trainer][INFO] - Deleting older checkpoint [/workspace/jbcs2025/outputs/2025-06-30/08-05-48/results/checkpoint-320] due to args.save_total_limit
[2025-06-30 09:25:46,809][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:25:46,815][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:25:46,815][transformers.trainer][INFO] - Num examples = 132
[2025-06-30 09:25:46,816][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:26:31,982][__main__][INFO] - Training completed successfully.
[2025-06-30 09:26:31,982][__main__][INFO] - Running on Test
[2025-06-30 09:26:31,982][transformers.trainer][INFO] - The following columns in the Evaluation set don't have a corresponding argument in `PeftModelForSequenceClassification.forward` and have been ignored: id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades. If id_prompt, prompt, id, reference, essay_text, supporting_text, essay_year, grades are not expected by `PeftModelForSequenceClassification.forward`, you can safely ignore this message.
[2025-06-30 09:26:31,985][transformers.trainer][INFO] -
***** Running Evaluation *****
[2025-06-30 09:26:31,985][transformers.trainer][INFO] - Num examples = 138
[2025-06-30 09:26:31,986][transformers.trainer][INFO] - Batch size = 16
[2025-06-30 09:27:20,819][__main__][INFO] - Test metrics: {'eval_loss': 1.8242748975753784, 'eval_model_preparation_time': 0.0097, 'eval_accuracy': 0.4420289855072464, 'eval_RMSE': 54.053113975374096, 'eval_QWK': 0.45872245050429594, 'eval_HDIV': 0.02898550724637683, 'eval_Macro_F1': 0.22332451499118164, 'eval_Micro_F1': 0.4420289855072464, 'eval_Weighted_F1': 0.37278582930756843, 'eval_TP_0': 0, 'eval_TN_0': 137, 'eval_FP_0': 0, 'eval_FN_0': 1, 'eval_TP_1': 22, 'eval_TN_1': 83, 'eval_FP_1': 20, 'eval_FN_1': 13, 'eval_TP_2': 0, 'eval_TN_2': 133, 'eval_FP_2': 0, 'eval_FN_2': 5, 'eval_TP_3': 35, 'eval_TN_3': 38, 'eval_FP_3': 49, 'eval_FN_3': 16, 'eval_TP_4': 0, 'eval_TN_4': 112, 'eval_FP_4': 0, 'eval_FN_4': 26, 'eval_TP_5': 4, 'eval_TN_5': 110, 'eval_FP_5': 8, 'eval_FN_5': 16, 'eval_runtime': 48.8241, 'eval_samples_per_second': 2.826, 'eval_steps_per_second': 0.184, 'epoch': 10.0}
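The bootstrap block in the config (n_bootstrap=10000, bootstrap_seed=42, metrics QWK/Macro_F1/Weighted_F1) implies confidence intervals are computed over these test metrics. A generic percentile-bootstrap sketch with a hypothetical helper, not the project's actual routine:

    import numpy as np

    def bootstrap_ci(y_true, y_pred, metric, n_bootstrap=10000, seed=42):
        """95% percentile CI from resampling (label, prediction) pairs with replacement."""
        rng = np.random.default_rng(seed)
        y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
        stats = []
        for _ in range(n_bootstrap):
            idx = rng.integers(0, len(y_true), size=len(y_true))
            stats.append(metric(y_true[idx], y_pred[idx]))
        return np.percentile(stats, [2.5, 97.5])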
[2025-06-30 09:27:20,820][transformers.trainer][INFO] - Saving model checkpoint to ./results/best_model
[2025-06-30 09:27:21,697][transformers.tokenization_utils_base][INFO] - chat template saved in ./results/best_model/chat_template.jinja
[2025-06-30 09:27:21,698][transformers.tokenization_utils_base][INFO] - tokenizer config file saved in ./results/best_model/tokenizer_config.json
[2025-06-30 09:27:21,698][transformers.tokenization_utils_base][INFO] - Special tokens file saved in ./results/best_model/special_tokens_map.json
[2025-06-30 09:27:21,769][__main__][INFO] - Model and tokenizer saved to ./results/best_model
[2025-06-30 09:27:21,789][__main__][INFO] - Fine Tuning Finished.
[2025-06-30 09:27:22,297][__main__][INFO] - Total emissions: 0.5175 kg CO2eq
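The emissions line suggests a codecarbon-style tracker wrapped the run; the log does not name the tool, so the minimal equivalent below is an assumption:

    # Assumes the `codecarbon` package; the project's integration is not shown here.
    from codecarbon import EmissionsTracker

    tracker = EmissionsTracker()
    tracker.start()
    # ... fine-tuning run ...
    emissions_kg = tracker.stop()  # returns emissions in kg CO2eq
    print(f"Total emissions: {emissions_kg:.4f} kg CO2eq")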