
Kojima-takeshi188's group workspace

2024-05-14 04:40:44
  File "/usr/local/lib/python3.10/dist-packages/wandb/sdk/interface/interface.py", line 724, in publish_output_raw
    o.timestamp.GetCurrentTime()
  File "/usr/local/lib/python3.10/dist-packages/google/protobuf/internal/well_known_types.py", line 195, in GetCurrentTime
    self.FromDatetime(datetime.datetime.utcnow())
KeyboardInterrupt
Original exception was:
Traceback (most recent call last):
  File "/home/acc12029sa/llm-vip-seminar-vl03/notebooks/llm/train_speech.py", line 517, in <module>
    train_accelerate_ddp(args.batch_size, args.epochs, args.total_iteration_num, args.learning_rate, args.warmup_steps, args.your_name, args.model_name, args.num_gpus)
  File "/home/acc12029sa/llm-vip-seminar-vl03/notebooks/llm/train_speech.py", line 205, in train_accelerate_ddp
    train_dataloader, validation_dataloader, test_dataloader, tokenizer = set_dataloader_tokenizer(batch_size, model_name)
  File "/home/acc12029sa/llm-vip-seminar-vl03/notebooks/llm/train_speech.py", line 73, in set_dataloader_tokenizer
    dataset = GPT2Dataset(open(datafile_path).read().splitlines(), tokenizer, max_length=max_length) #768)
  File "/home/acc12029sa/llm-vip-seminar-vl03/notebooks/llm/train_speech.py", line 58, in __init__
    self.input_ids.append(torch.tensor(encodings_dict['input_ids']))
KeyboardInterrupt
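
The interrupt lands inside GPT2Dataset.__init__, which apparently tokenizes every line of the data file eagerly before training starts; on a large file that loop can run for a long time, which is why a Ctrl-C surfaces there. A minimal sketch of such a dataset, assuming a callable tokenizer that returns a dict containing 'input_ids' (the real train_speech.py is not shown, and the toy_tokenizer here is a stand-in, not the script's actual tokenizer):

```python
import torch
from torch.utils.data import Dataset

class GPT2Dataset(Dataset):
    """Hypothetical reconstruction: pre-tokenizes all lines up front."""

    def __init__(self, lines, tokenizer, max_length=768):
        self.input_ids = []
        for line in lines:
            # One tokenizer call per line; on a big corpus this loop is the
            # long-running step where a KeyboardInterrupt would be raised,
            # matching line 58 in the traceback above.
            encodings_dict = tokenizer(line, truncation=True, max_length=max_length)
            self.input_ids.append(torch.tensor(encodings_dict['input_ids']))

    def __len__(self):
        return len(self.input_ids)

    def __getitem__(self, idx):
        return self.input_ids[idx]

# Stand-in tokenizer for illustration only: maps characters to code points.
def toy_tokenizer(text, truncation=True, max_length=768):
    ids = [ord(c) for c in text][:max_length]
    return {'input_ids': ids}

ds = GPT2Dataset(["hello", "world!"], toy_tokenizer, max_length=8)
```

Tokenizing lazily in __getitem__ (or caching the tokenized tensors to disk) would avoid this long up-front pass.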