somoclu

A software library for training self-organizing maps on large datasets using parallel computing techniques.

Massively parallel self-organizing maps: accelerate training on multicore CPUs, GPUs, and clusters.
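To make the underlying technique concrete, here is a minimal sketch of the classic online self-organizing-map algorithm that somoclu accelerates: each input vector pulls its best-matching unit (BMU) and that unit's grid neighbors toward itself, with a shrinking learning rate and neighborhood radius. This is an illustrative pure-Python implementation, not somoclu's API; the function name `train_som` and its parameters are assumptions for this sketch.

```python
import math
import random

def train_som(data, rows, cols, epochs=30, lr0=0.5, sigma0=None, seed=0):
    """Train a small SOM with the classic online update rule (illustrative sketch).

    data: list of equal-length feature vectors (lists of floats).
    Returns the codebook as a dict mapping (row, col) -> weight vector.
    """
    rng = random.Random(seed)
    dim = len(data[0])
    if sigma0 is None:
        sigma0 = max(rows, cols) / 2.0
    # Initialize codebook vectors with small random weights.
    codebook = {(r, c): [rng.uniform(-0.1, 0.1) for _ in range(dim)]
                for r in range(rows) for c in range(cols)}
    for epoch in range(epochs):
        # Linearly decay the learning rate and neighborhood radius.
        frac = epoch / max(1, epochs - 1)
        lr = lr0 * (1.0 - frac)
        sigma = sigma0 * (1.0 - frac) + 0.5
        for x in data:
            # Find the best-matching unit: the closest codebook vector.
            bmu = min(codebook,
                      key=lambda u: sum((w - v) ** 2
                                        for w, v in zip(codebook[u], x)))
            # Pull every unit toward x, weighted by a Gaussian
            # of its grid distance to the BMU.
            for (r, c), w in codebook.items():
                d2 = (r - bmu[0]) ** 2 + (c - bmu[1]) ** 2
                h = lr * math.exp(-d2 / (2.0 * sigma ** 2))
                for i in range(dim):
                    w[i] += h * (x[i] - w[i])
    return codebook
```

The BMU search and the per-unit updates are independent across data points and grid units, which is exactly what makes the algorithm amenable to the multicore, GPU, and cluster parallelization that somoclu provides.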

GitHub

268 stars
28 watching
70 forks
Language: C
Last commit: 10 months ago
Linked from 1 awesome list

Related projects:

| Repository | Description | Stars |
|---|---|---|
| avinashshenoy97/rusticsom | A Rust library for training and applying Self Organising Maps (SOM) to machine learning tasks. | 33 |
| open-mmlab/mmengine | Provides a flexible and configurable framework for training deep learning models with PyTorch. | 1,179 |
| wojzaremba/algorithm-learning | A framework to learn simple algorithms from examples by generating and visualizing intermediate solutions. | 180 |
| sevamoo/sompy | A Python library implementing the Self Organizing Map (SOM) algorithm for data analysis and visualization. | 535 |
| soumith/imagenet-multigpu.torch | A toolkit for training neural networks on the ImageNet dataset using multiple GPUs. | 402 |
| volcengine/vescale | A PyTorch-based framework for training large language models in parallel on multiple devices. | 663 |
| wyy-123-xyy/ra-fed | A Python implementation of a distributed machine learning framework for training neural networks on multiple GPUs. | 6 |
| wojciechmo/yolo2 | Trains a YOLOv2 object detector from scratch using TensorFlow. | 138 |
| moses-smt/nplm | A toolkit for training neural network language models. | 14 |
| mljs/som | An implementation of a self-organizing map algorithm for mapping high-dimensional data to lower-dimensional spaces. | 23 |
| huggingface/nanotron | A library for training large language models with parallel computing and mixed precision training methods. | 1,244 |
| zekrotja/timedmap | A package implementing a thread-safe map with expiring key-value pairs. | 72 |
| raphaelquast/eomaps | A Python package to create interactive maps of geographical datasets. | 339 |
| publiclab/mapknitter | A tool for combining and preparing aerial images for mapping purposes. | 267 |
| layneh/self-adaptive-training | Improves deep network generalization under noise and enhances self-supervised representation learning. | 127 |