---
language: en
license: mit
tags:
- bitmar
- multimodal
- babylm
- cross-modal
- no-memory
datasets:
- babylm_multimodal
metrics:
- bleu
- cross_modal_similarity
---

# BitMar 100M Token Model (No Episodic Memory)

This model was trained on exactly 100 million tokens as part of the BabyLM Challenge, with the episodic memory component disabled.

## Training Details
- Status: Completed
- Training dataset: 100,000,000 tokens
- Epochs completed: 10
- Tokens processed across all epochs: 996,822,486
- Best cross-modal similarity: 0.3342 (metric sketched below)
- Episodic memory: Disabled
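
The card does not define the cross-modal similarity metric; it is most naturally read as the cosine similarity between paired text and image embeddings, so the snippet below is a minimal sketch under that assumption. The `text_emb` and `image_emb` tensors are hypothetical stand-ins for the model's pooled outputs in the shared 128-dimensional space (see Model Architecture below).

```python
import torch
import torch.nn.functional as F

# Hypothetical pooled embeddings from the text and vision pathways,
# both living in the shared 128-dim space.
text_emb = torch.randn(8, 128)   # batch of 8 text embeddings
image_emb = torch.randn(8, 128)  # batch of 8 paired image embeddings

# Cosine similarity of each (text, image) pair, averaged over the batch --
# one common definition of a "cross-modal similarity" score.
similarity = F.cosine_similarity(text_emb, image_emb, dim=-1).mean()
print(f"Cross-modal similarity: {similarity.item():.4f}")
```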

## Model Architecture
- Text encoder: 4 layers, hidden size 128
- Vision encoder: DINOv2 features compressed to 128 dimensions (a projection sketch follows this list)
- Episodic memory: Disabled for this comparison study
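
The card does not say how the DINOv2 features are compressed; a linear projection is the simplest plausible choice, so the following is only a sketch under that assumption. The 768-dim input assumes DINOv2 ViT-B/14 features; the actual checkpoint may use a different DINOv2 variant or compression module.

```python
import torch
import torch.nn as nn

# Hypothetical compression of DINOv2 features into the 128-dim text space.
# 768 matches DINOv2 ViT-B/14 output; swap in the dimension of the variant
# actually used, which this card does not state.
class VisionCompressor(nn.Module):
    def __init__(self, dinov2_dim: int = 768, hidden_size: int = 128):
        super().__init__()
        self.proj = nn.Linear(dinov2_dim, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.norm(self.proj(features))

compressor = VisionCompressor()
dinov2_features = torch.randn(1, 768)     # one precomputed DINOv2 feature vector
compressed = compressor(dinov2_features)  # -> shape (1, 128)
```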

## Usage
```python
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint and its tokenizer from the Hugging Face Hub.
model = AutoModel.from_pretrained("estebancarlin/bitmar-no-memory")
tokenizer = AutoTokenizer.from_pretrained("estebancarlin/bitmar-no-memory")
```
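
A minimal text-only forward pass might then look like the following. This assumes the checkpoint exposes a standard `transformers` interface; if the repo ships custom modeling code, `trust_remote_code=True` would be needed on both `from_pretrained` calls above.

```python
import torch

# Tokenize a sample caption and run it through the model.
inputs = tokenizer("A child looks at a red balloon.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Hidden states from the 4-layer, 128-dim text encoder described above.
print(outputs.last_hidden_state.shape)  # e.g. (1, seq_len, 128)
```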

