Runtime error

Exit code: 1. Reason:

.00/61.3M [00:00<?, ?B/s]
tokenizer.json: 100%|██████████| 61.3M/61.3M [00:00<00:00, 113MB/s]
special_tokens_map.json:   0%|          | 0.00/280 [00:00<?, ?B/s]
special_tokens_map.json: 100%|██████████| 280/280 [00:00<00:00, 2.56MB/s]
The tokenizer class you load from this checkpoint is not the same type as the class this function is called from. It may result in unexpected tokenization.
The tokenizer class you load from this checkpoint is 'XLMRobertaTokenizer'.
The class this function is called from is 'CLIPTokenizerFast'.
Traceback (most recent call last):
  File "/app/app.py", line 7, in <module>
    processor = AutoProcessor.from_pretrained(model_name)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/processing_auto.py", line 313, in from_pretrained
    return processor_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/processing_utils.py", line 465, in from_pretrained
    args = cls._get_arguments_from_pretrained(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/processing_utils.py", line 511, in _get_arguments_from_pretrained
    args.append(attribute_class.from_pretrained(pretrained_model_name_or_path, **kwargs))
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2086, in from_pretrained
    return cls._from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py", line 2325, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/clip/tokenization_clip_fast.py", line 93, in __init__
    super().__init__(
  File "/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_fast.py", line 111, in __init__
    fast_tokenizer = TokenizerFast.from_file(fast_tokenizer_file)
Exception: data did not match any variant of untagged enum PyPreTokenizerTypeWrapper at line 87 column 3
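
The exception is raised inside TokenizerFast.from_file while AutoProcessor assembles the CLIP processor: the checkpoint declares an 'XLMRobertaTokenizer', but it is being parsed through 'CLIPTokenizerFast', and the "data did not match any variant of untagged enum PyPreTokenizerTypeWrapper" message usually means the checkpoint's tokenizer.json was written by a newer tokenizers release than the one installed in the container. The sketch below is a diagnostic starting point, not a confirmed fix: model_name is a placeholder for whatever checkpoint app.py passes on line 7 (the log does not show it), and loading the tokenizer and image processor separately via AutoTokenizer / AutoImageProcessor is an assumed workaround.

    # Diagnostic sketch; placeholders and assumptions noted above.
    import tokenizers
    import transformers

    # 1. Check the versions installed in the container. An old `tokenizers`
    #    cannot deserialize tokenizer.json files produced by newer releases,
    #    which is the usual source of the "untagged enum" exception.
    print("transformers", transformers.__version__)
    print("tokenizers", tokenizers.__version__)

    # 2. Hypothetical workaround: load the processor's pieces separately, so
    #    the tokenizer class is resolved from the checkpoint's own config
    #    instead of being forced through CLIPTokenizerFast.
    from transformers import AutoImageProcessor, AutoTokenizer

    model_name = "<checkpoint id used in app.py>"  # placeholder; not shown in the log

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    image_processor = AutoImageProcessor.from_pretrained(model_name)

If the installed versions are already current and the separate load still fails on the same TokenizerFast.from_file call, pinning transformers and tokenizers in the Space's requirements.txt to the versions the checkpoint was published with is another commonly suggested route.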
