The most convenient way to use VAME is through the VAME pipeline. The pipeline class automates the following steps:
- loading the data
- preprocessing the data
- creating and training the VAME model
- validating the VAME model
- segmenting the behavior into motifs
- clustering the motifs into communities
- visualizing the results
Let's start by importing the necessary libraries:
%load_ext autoreload
%autoreload 2
import pprint
from vame.pipeline import VAMEPipeline
from vame.util.sample_data import download_sample_data
2024-12-31 10:48:29.956 INFO --- [MainThread] vame.model.rnn_vae : 25 : Using CUDA
2024-12-31 10:48:29.957 INFO --- [MainThread] vame.model.rnn_vae : 26 : GPU active: True
2024-12-31 10:48:29.994 INFO --- [MainThread] vame.model.rnn_vae : 27 : GPU used: NVIDIA GeForce RTX 2060
Downloading data from 'https://gin.g-node.org/neuroinformatics/movement-test-data/raw/master/metadata.yaml' to file '/home/luiz/.movement/data/temp_metadata.yaml'.
SHA256 hash of downloaded file: 01340c169837a135995ebf6698c3add72e3b4bef5e0bf64150103f2b0e45dd25
Use this value as the 'known_hash' argument of 'pooch.retrieve' to ensure that the file hasn't changed if it is downloaded again in the future.
2024-12-31 10:48:34.330 DEBUG --- [MainThread] movement.sample_data : 66 : Successfully downloaded sample metadata file metadata.yaml from https://gin.g-node.org/neuroinformatics/movement-test-data/raw/master to /home/luiz/.movement/data
Input data
To quickly try VAME, you can download sample data and use it as input. If you want to work with your own data, all you need to do is provide the paths to your video and pose estimation files as lists of strings.
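For reference, an own-data setup would look like the sketch below; the file paths here are hypothetical placeholders, not part of the sample dataset. In this tutorial we continue with the sample data instead.
# Using your own recordings instead of the sample data (hypothetical paths)
source_software = "DeepLabCut"  # or "SLEAP" / "LightningPose"
videos = [
    "/path/to/session_1.mp4",
    "/path/to/session_2.mp4",
]
poses_estimations = [
    "/path/to/session_1_poses.csv",
    "/path/to/session_2_poses.csv",
]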
# You can run VAME with data from different sources:
# "DeepLabCut", "SLEAP" or "LightningPose"
source_software = "DeepLabCut"
# Download sample data
ps = download_sample_data(source_software)
videos = [ps["video"]]
poses_estimations = [ps["poses"]]
pprint.pp(videos)
pprint.pp(poses_estimations)
['/home/luiz/.movement/data/videos/single-mouse_EPM_video.mp4']
['/home/luiz/.movement/data/poses/DLC_single-mouse_EPM.predictions.csv']
Instantiate the VAME pipeline
Now it's time to instantiate the VAME pipeline. Choose your working directory, your project name, and any extra configuration arguments. The extra configuration arguments are optional and can be used to customize the VAME pipeline.
# Set up your working directory and project name
working_directory = '.'
project_name = 'pipeline_example'
# Customize the configuration for the project
config_kwargs = {
"n_clusters": 15,
"pose_confidence": 0.9,
"max_epochs": 100,
}
# Instantiate the pipeline
# this will create a VAME project and prepare the data
pipeline = VAMEPipeline(
working_directory=working_directory,
project_name=project_name,
videos=videos,
poses_estimations=poses_estimations,
source_software=source_software,
config_kwargs=config_kwargs,
)
2024-12-31 10:48:34.443 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/data"
2024-12-31 10:48:34.444 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/data/raw"
2024-12-31 10:48:34.445 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/data/processed"
2024-12-31 10:48:34.446 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/results"
2024-12-31 10:48:34.447 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/model"
2024-12-31 10:48:34.448 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/model/pretrained_model"
2024-12-31 10:48:34.448 INFO --- [MainThread] vame.initialize_project.new : 109 : Created "/mnt/shared_storage/Github/VAME/examples/pipeline_example/model/evaluate"
2024-12-31 10:48:34.449 INFO --- [MainThread] vame.initialize_project.new : 181 : Copying / linking the video files...
2024-12-31 10:48:34.450 INFO --- [MainThread] vame.initialize_project.new : 188 : Creating symbolic link from /home/luiz/.movement/data/videos/single-mouse_EPM_video.mp4 to /mnt/shared_storage/Github/VAME/examples/pipeline_example/data/raw/single-mouse_EPM_video.mp4
2024-12-31 10:48:34.466 INFO --- [MainThread] vame.initialize_project.new : 194 : Copying pose estimation raw data...
2024-12-31 10:48:34.574 DEBUG --- [MainThread] movement.io.load_poses : 394 : Loaded poses from /home/luiz/.movement/data/poses/DLC_single-mouse_EPM.predictions.csv into a DataFrame.
2024-12-31 10:48:34.576 INFO --- [MainThread] movement.io.load_poses : 401 : Loaded pose tracks from /home/luiz/.movement/data/poses/DLC_single-mouse_EPM.predictions.csv:
2024-12-31 10:48:34.578 INFO --- [MainThread] movement.io.load_poses : 402 : <xarray.Dataset> Size: 4MB
Dimensions:      (time: 18485, individuals: 1, keypoints: 8, space: 2)
Coordinates:
  * time         (time) float64 148kB 0.0 0.03333 0.06667 ... 616.1 616.1 616.1
  * individuals  (individuals) <U12 48B 'individual_0'
  * keypoints    (keypoints) <U13 416B 'snout' 'left_ear' ... 'tail_end'
  * space        (space) <U1 8B 'x' 'y'
Data variables:
    position     (time, individuals, keypoints, space) float64 2MB 508.4 ... ...
    confidence   (time, individuals, keypoints) float64 1MB 0.0002829 ... 0.9978
Attributes:
    fps:              30.0
    time_unit:        seconds
    source_software:  DeepLabCut
    source_file:      /home/luiz/.movement/data/poses/DLC_single-mouse_EPM.pr...
    ds_type:          poses
2024-12-31 10:48:34.681 INFO --- [MainThread] vame.initialize_project.new : 245 : A VAME project has been created at /mnt/shared_storage/Github/VAME/examples/pipeline_example
Before running the pipeline, you can check the input datasets:
ds = pipeline.get_raw_datasets()
ds
<xarray.Dataset> Size: 4MB
Dimensions:      (session: 1, time: 18485, individuals: 1, keypoints: 8, space: 2)
Coordinates:
  * session      (session) object 8B 'single-mouse_EPM_video'
  * time         (time) float64 148kB 0.0 0.03333 0.06667 ... 616.1 616.1 616.1
  * keypoints    (keypoints) object 64B 'snout' 'left_ear' ... 'tail_end'
  * space        (space) object 16B 'x' 'y'
  * individuals  (individuals) object 8B 'individual_0'
Data variables:
    position     (session, time, individuals, keypoints, space) float64 2MB 5...
    confidence   (session, time, individuals, keypoints) float64 1MB 0.000282...
Attributes:
    fps:              30.0
    time_unit:        seconds
    source_software:  DeepLabCut
    source_file:      /home/luiz/.movement/data/poses/DLC_single-mouse_EPM.pr...
    ds_type:          poses
    video_path:       /home/luiz/.movement/data/videos/single-mouse_EPM_video...
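If you are unsure which keypoint names are available for the alignment step in the next section, you can read them directly from the dataset's coordinates (a quick check on the xarray Dataset shown above):
# List the keypoint names contained in the pose estimation data
print(ds.keypoints.values)
# -> for this sample dataset: 'snout', 'left_ear', ..., 'tail_end'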
Run the pipeline
Now you can run the pipeline. At this point, you should pass the names of the pose estimation keypoints to be used for egocentric alignment.
Note: The pipeline will take some time to run, depending on the size of the dataset, the number of epochs, and whether you are using a GPU.
preprocessing_kwargs = {
"centered_reference_keypoint": "snout",
"orientation_reference_keypoint": "tailbase",
}
pipeline.run_pipeline(preprocessing_kwargs=preprocessing_kwargs)
2024-12-31 10:48:34.761 INFO --- [MainThread] vame.preprocessing.preprocessing : 38 : Cleaning low confidence data points...
2024-12-31 10:48:34.762 INFO --- [MainThread] vame.preprocessing.cleaning : 39 : Cleaning low confidence data points. Confidence threshold: 0.9
2024-12-31 10:48:34.762 INFO --- [MainThread] vame.preprocessing.cleaning : 42 : Session: single-mouse_EPM_video
2024-12-31 10:48:34.832 INFO --- [MainThread] vame.preprocessing.preprocessing : 46 : Egocentrically aligning and centering...
2024-12-31 10:48:34.832 INFO --- [MainThread] vame.preprocessing.alignment : 36 : Egocentric alignment with references: snout and tailbase
2024-12-31 10:48:34.833 INFO --- [MainThread] vame.preprocessing.alignment : 43 : Session: single-mouse_EPM_video
2024-12-31 10:48:35.92 INFO --- [MainThread] vame.preprocessing.preprocessing : 56 : Cleaning outliers...
2024-12-31 10:48:35.93 INFO --- [MainThread] vame.preprocessing.cleaning : 114 : Cleaning outliers with Z-score transformation and IQR cutoff.
2024-12-31 10:48:35.94 INFO --- [MainThread] vame.preprocessing.cleaning : 119 : Session: single-mouse_EPM_video
2024-12-31 10:48:35.204 INFO --- [MainThread] vame.preprocessing.preprocessing : 64 : Applying Savitzky-Golay filter...
2024-12-31 10:48:35.205 INFO --- [MainThread] vame.preprocessing.filter : 34 : Applying Savitzky-Golay filter...
2024-12-31 10:48:35.206 INFO --- [MainThread] vame.preprocessing.filter : 41 : Session: single-mouse_EPM_video
2024-12-31 10:48:35.336 INFO --- [MainThread] vame.model.create_training : 286 : Creating training dataset...
2024-12-31 10:48:35.337 INFO --- [MainThread] vame.model.create_training : 289 : Creating trainset from the vame.egocentrical_alignment() output
2024-12-31 10:48:35.392 INFO --- [MainThread] vame.model.create_training : 71 : Lenght of train data: 16637
2024-12-31 10:48:35.393 INFO --- [MainThread] vame.model.create_training : 72 : Lenght of test data: 1848
2024-12-31 10:48:35.395 INFO --- [MainThread] vame.model.create_training : 307 : A training and test set has been created. Next step: vame.train_model()
2024-12-31 10:48:35.399 INFO --- [MainThread] vame.model.rnn_vae : 525 : Train Variational Autoencoder - model name: VAME
2024-12-31 10:48:35.401 INFO --- [MainThread] vame.model.rnn_vae : 534 : Using CUDA
2024-12-31 10:48:35.402 INFO --- [MainThread] vame.model.rnn_vae : 535 : GPU active: True
2024-12-31 10:48:35.402 INFO --- [MainThread] vame.model.rnn_vae : 536 : GPU used: NVIDIA GeForce RTX 2060
2024-12-31 10:48:35.403 INFO --- [MainThread] vame.model.rnn_vae : 584 : Latent Dimensions: 30, Time window: 30, Batch Size: 256, Beta: 1, lr: 0.0005
2024-12-31 10:48:35.534 INFO --- [MainThread] vame.model.dataloader : 52 : Compute mean and std for temporal dataset.
2024-12-31 10:48:35.536 INFO --- [MainThread] vame.model.dataloader : 62 : Initialize train data. Datapoints 16637
2024-12-31 10:48:35.539 INFO --- [MainThread] vame.model.dataloader : 64 : Initialize test data. Datapoints 1848
2024-12-31 10:48:36.86 INFO --- [MainThread] vame.model.rnn_vae : 696 : Scheduler step size: 100, Scheduler gamma: 0.20
/home/luiz/anaconda3/envs/vame/lib/python3.11/site-packages/torch/optim/lr_scheduler.py:28: UserWarning: The verbose parameter is deprecated. Please use get_last_lr() to access the learning rate.
  warnings.warn("The verbose parameter is deprecated. Please use get_last_lr() "
2024-12-31 10:48:36.87 INFO --- [MainThread] vame.model.rnn_vae : 712 : Start training...
Training Model:   0%|          | 0/99 [00:00<?, ?epoch/s]
2024-12-31 10:48:40.491 INFO --- [MainThread] vame.model.rnn_vae : 348 : Train loss: 84292.993, MSE-Loss: 51861.721, MSE-Future-Loss 32431.272, KL-Loss: 0.000, Kmeans-Loss: 0.000, weight: 0.00
2024-12-31 10:48:40.713 INFO --- [MainThread] vame.model.rnn_vae : 462 : Test loss: 23826.222, MSE-Loss: 23826.222, KL-Loss: 0.000, Kmeans-Loss: 0.000
2024-12-31 10:48:40.719 INFO --- [MainThread] vame.model.rnn_vae : 879 : Training Model:   1%|▏         | 1/99 [00:04<07:33,  4.63s/epoch]
2024-12-31 10:48:44.817 INFO --- [MainThread] vame.model.rnn_vae : 348 : Train loss: 47643.152, MSE-Loss: 26930.282, MSE-Future-Loss 20712.871, KL-Loss: 0.000, Kmeans-Loss: 0.000, weight: 0.00
2024-12-31 10:48:45.40 INFO --- [MainThread] vame.model.rnn_vae : 462 : Test loss: 20163.279, MSE-Loss: 20163.279, KL-Loss: 0.000, Kmeans-Loss: 0.000
...
2024-12-31 10:55:51.970 INFO --- [MainThread] vame.model.rnn_vae : 348 : Train loss: 3365.857, MSE-Loss: 1914.004, MSE-Future-Loss 1437.527, KL-Loss: 8.708, Kmeans-Loss: 5.618, weight: 1.00
2024-12-31 10:55:52.241 INFO --- [MainThread] vame.model.rnn_vae : 462 : Test loss: 11090.282, MSE-Loss: 11075.988, KL-Loss: 8.879, Kmeans-Loss: 5.415
2024-12-31 10:55:52.242 INFO --- [MainThread] vame.model.rnn_vae : 793 : Finished training...
2024-12-31 10:55:52.243 INFO --- [MainThread] vame.model.rnn_vae : 794 : Model converged. Please check your model with vame.evaluate_model(). You can also re-run vame.train_model() to further improve your model. Make sure to set _pretrained_weights_ in your config.yaml to "true" and plug your current model name into _pretrained_model_. Hint: Set "model_convergence" in your config.yaml to a higher value. Next: Use vame.segment_session() to identify behavioral motifs in your dataset!
Training Model:  87%|████████████████████▊    | 86/99 [07:16<01:05,  5.07s/epoch]
2024-12-31 10:55:52.249 INFO --- [MainThread] vame.model.evaluate : 222 : Using CUDA
2024-12-31 10:55:52.249 INFO --- [MainThread] vame.model.evaluate : 223 : GPU active: True
2024-12-31 10:55:52.250 INFO --- [MainThread] vame.model.evaluate : 224 : GPU used: NVIDIA GeForce RTX 2060
2024-12-31 10:55:52.251 INFO --- [MainThread] vame.model.evaluate : 229 : Evaluation of model: VAME
2024-12-31 10:55:52.290 INFO --- [MainThread] vame.model.evaluate : 64 : Initialize test data. Datapoints 1848
2024-12-31 10:55:52.626 INFO --- [MainThread] vame.model.evaluate : 251 : You can find the results of the evaluation in '/Your-VAME-Project/model/evaluate/'
OPTIONS:
- vame.segment_session() to identify behavioral motifs.
- re-run the model for further fine tuning. Check again with vame.evaluate_model()
2024-12-31 10:55:52.632 INFO --- [MainThread] vame.analysis.pose_segmentation : 318 : Pose segmentation for VAME model: VAME
2024-12-31 10:55:52.632 INFO --- [MainThread] vame.analysis.pose_segmentation : 319 : Segmentation algorithms: ['hmm', 'kmeans']
2024-12-31 10:55:52.633 INFO --- [MainThread] vame.analysis.pose_segmentation : 322 : Running pose segmentation using hmm algorithm...
2024-12-31 10:55:52.635 INFO --- [MainThread] vame.analysis.pose_segmentation : 354 : Using CUDA
2024-12-31 10:55:52.635 INFO --- [MainThread] vame.analysis.pose_segmentation : 355 : GPU active: True
2024-12-31 10:55:52.636 INFO --- [MainThread] vame.analysis.pose_segmentation : 356 : GPU used: NVIDIA GeForce RTX 2060
2024-12-31 10:55:52.637 INFO --- [MainThread] vame.util.model_util : 40 : Loading model...
2024-12-31 10:55:52.677 INFO --- [MainThread] vame.analysis.pose_segmentation : 68 : Embedding of latent vector for file single-mouse_EPM_video
100%|████████████████████████████████████| 18455/18455 [00:24<00:00, 741.72it/s]
2024-12-31 10:56:17.610 INFO --- [MainThread] vame.analysis.pose_segmentation : 392 : Apply the same segmentation of latent vectors for all sessions, 15 clusters
2024-12-31 10:56:17.611 INFO --- [MainThread] vame.analysis.pose_segmentation : 177 : Using a HMM as segmentation algorithm!
2024-12-31 10:57:56.783 INFO --- [MainThread] vame.analysis.pose_segmentation : 197 : Getting motif usage for single-mouse_EPM_video
2024-12-31 10:57:56.784 INFO --- [MainThread] vame.analysis.pose_segmentation : 473 : /mnt/shared_storage/Github/VAME/examples/pipeline_example/results/single-mouse_EPM_video/VAME/hmm-15/
2024-12-31 10:57:56.814 INFO --- [MainThread] vame.analysis.pose_segmentation : 538 : You succesfully extracted motifs with VAME! From here, you can proceed running vame.motif_videos()
2024-12-31 10:57:56.815 INFO --- [MainThread] vame.analysis.pose_segmentation : 322 : Running pose segmentation using kmeans algorithm...
2024-12-31 10:57:56.815 INFO --- [MainThread] vame.analysis.pose_segmentation : 354 : Using CUDA
2024-12-31 10:57:56.816 INFO --- [MainThread] vame.analysis.pose_segmentation : 355 : GPU active: True
2024-12-31 10:57:56.816 INFO --- [MainThread] vame.analysis.pose_segmentation : 356 : GPU used: NVIDIA GeForce RTX 2060
2024-12-31 10:57:56.817 INFO --- [MainThread] vame.util.model_util : 40 : Loading model...
2024-12-31 10:57:56.865 INFO --- [MainThread] vame.analysis.pose_segmentation : 68 : Embedding of latent vector for file single-mouse_EPM_video
100%|████████████████████████████████████| 18455/18455 [00:20<00:00, 882.69it/s]
2024-12-31 10:58:17.806 INFO --- [MainThread] vame.analysis.pose_segmentation : 392 : Apply the same segmentation of latent vectors for all sessions, 15 clusters
2024-12-31 10:58:17.807 INFO --- [MainThread] vame.analysis.pose_segmentation : 164 : Using kmeans as segmentation algorithm!
2024-12-31 10:58:18.878 INFO --- [MainThread] vame.analysis.pose_segmentation : 197 : Getting motif usage for single-mouse_EPM_video
2024-12-31 10:58:18.879 INFO --- [MainThread] vame.analysis.pose_segmentation : 473 : /mnt/shared_storage/Github/VAME/examples/pipeline_example/results/single-mouse_EPM_video/VAME/kmeans-15/
2024-12-31 10:58:18.896 INFO --- [MainThread] vame.analysis.pose_segmentation : 538 : You succesfully extracted motifs with VAME! From here, you can proceed running vame.motif_videos()
2024-12-31 10:58:18.904 INFO --- [MainThread] vame.analysis.community_analysis : 198 : Zero motifs: []
/mnt/shared_storage/Github/VAME/src/vame/analysis/tree_hierarchy.py:42: RuntimeWarning: divide by zero encountered in scalar divide
  cost = motif_norm[i] + motif_norm[j] / np.abs(transition_matrix[i, j] + transition_matrix[j, i])
/mnt/shared_storage/Github/VAME/src/vame/analysis/tree_hierarchy.py:42: RuntimeWarning: invalid value encountered in scalar divide
  cost = motif_norm[i] + motif_norm[j] / np.abs(transition_matrix[i, j] + transition_matrix[j, i])
2024-12-31 10:58:19.160 INFO --- [MainThread] vame.analysis.community_analysis : 378 : Communities bag:
2024-12-31 10:58:19.161 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 0: [8]
2024-12-31 10:58:19.162 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 1: [10, 9, 7]
2024-12-31 10:58:19.162 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 2: [0, 3, 2, 14, 13]
2024-12-31 10:58:19.163 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 3: [5, 6, 11, 4, 12, 1]
Visualize the results
After running the pipeline, you can visualize the results:
pipeline.visualize_preprocessing(
show_figure=True,
save_to_file=False,
)
pipeline.visualize_model_losses(
show_figure=True,
save_to_file=False,
)
pipeline.visualize_motif_tree(segmentation_algorithm="hmm")
pipeline.visualize_umap(
label="community",
segmentation_algorithm="hmm",
show_figure=True,
)
2024-12-31 10:58:22.158 INFO --- [MainThread] vame.visualization.umap : 301 : Compute embedding for session single-mouse_EPM_video
2024-12-31 10:58:22.159 INFO --- [MainThread] vame.visualization.umap : 52 : UMAP calculation for session single-mouse_EPM_video
2024-12-31 10:58:22.161 INFO --- [MainThread] vame.visualization.umap : 65 : Embedding 18455 data points...
/home/luiz/anaconda3/envs/vame/lib/python3.11/site-packages/umap/umap_.py:1945: UserWarning: n_jobs value 1 overridden to 1 by setting random_state. Use no seed for parallelism.
  warn(f"n_jobs value {self.n_jobs} overridden to 1 by setting random_state. Use no seed for parallelism.")
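If you prefer to write the figures into the project folder instead of displaying them interactively, the same visualization methods accept the inverse flags. A minimal sketch, reusing only the arguments already shown above:
# Save the preprocessing and model-loss figures instead of showing them
pipeline.visualize_preprocessing(show_figure=False, save_to_file=True)
pipeline.visualize_model_losses(show_figure=False, save_to_file=True)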
Produce the pipeline report
pipeline.report()
2024-12-31 10:59:13.486 INFO --- [MainThread] vame.util.report : 79 : Cohort communities:
2024-12-31 10:59:13.486 INFO --- [MainThread] vame.util.report : 81 : Community 0: 2210 counts
2024-12-31 10:59:13.487 INFO --- [MainThread] vame.util.report : 83 : Motif 8: 2210 counts
2024-12-31 10:59:13.488 INFO --- [MainThread] vame.util.report : 81 : Community 1: 7244 counts
2024-12-31 10:59:13.489 INFO --- [MainThread] vame.util.report : 83 : Motif 10: 4139 counts
2024-12-31 10:59:13.489 INFO --- [MainThread] vame.util.report : 83 : Motif 9: 2101 counts
2024-12-31 10:59:13.490 INFO --- [MainThread] vame.util.report : 83 : Motif 7: 1004 counts
2024-12-31 10:59:13.491 INFO --- [MainThread] vame.util.report : 81 : Community 2: 4907 counts
2024-12-31 10:59:13.491 INFO --- [MainThread] vame.util.report : 83 : Motif 0: 1578 counts
2024-12-31 10:59:13.492 INFO --- [MainThread] vame.util.report : 83 : Motif 3: 525 counts
2024-12-31 10:59:13.492 INFO --- [MainThread] vame.util.report : 83 : Motif 2: 1077 counts
2024-12-31 10:59:13.493 INFO --- [MainThread] vame.util.report : 83 : Motif 14: 109 counts
2024-12-31 10:59:13.493 INFO --- [MainThread] vame.util.report : 83 : Motif 13: 1619 counts
2024-12-31 10:59:13.494 INFO --- [MainThread] vame.util.report : 81 : Community 3: 4094 counts
2024-12-31 10:59:13.494 INFO --- [MainThread] vame.util.report : 83 : Motif 5: 1044 counts
2024-12-31 10:59:13.495 INFO --- [MainThread] vame.util.report : 83 : Motif 6: 1053 counts
2024-12-31 10:59:13.496 INFO --- [MainThread] vame.util.report : 83 : Motif 11: 587 counts
2024-12-31 10:59:13.496 INFO --- [MainThread] vame.util.report : 83 : Motif 4: 322 counts
2024-12-31 10:59:13.497 INFO --- [MainThread] vame.util.report : 83 : Motif 12: 262 counts
2024-12-31 10:59:13.497 INFO --- [MainThread] vame.util.report : 83 : Motif 1: 825 counts
2024-12-31 10:59:13.847 INFO --- [MainThread] vame.util.report : 264 : Saved community / motifs plot to /mnt/shared_storage/Github/VAME/examples/pipeline_example/reports/community_motifs_cohort_VAME_hmm-15.png
2024-12-31 10:59:14.172 INFO --- [MainThread] vame.util.report : 264 : Saved community / motifs plot to /mnt/shared_storage/Github/VAME/examples/pipeline_example/reports/community_motifs_single-mouse_EPM_video_VAME_hmm-15.png
Resuming the pipeline
If for some reason you need to stop the pipeline, you can resume it later from any step. For example, to resume from the community clustering step:
pipeline.run_pipeline(from_step=5)
2024-12-31 11:00:02.8 INFO --- [MainThread] vame.analysis.community_analysis : 198 : Zero motifs: []
2024-12-31 11:00:02.236 INFO --- [MainThread] vame.analysis.community_analysis : 378 : Communities bag:
2024-12-31 11:00:02.237 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 0: [8]
2024-12-31 11:00:02.237 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 1: [10, 9, 7]
2024-12-31 11:00:02.237 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 2: [0, 3, 2, 14, 13]
2024-12-31 11:00:02.238 INFO --- [MainThread] vame.analysis.community_analysis : 380 : Community 3: [5, 6, 11, 4, 12, 1]