presto
Remote Sensing Transformer
Lightweight, Pre-trained Transformers for Remote Sensing Timeseries: a pre-trained transformer model for processing and analyzing remote sensing timeseries data
188 stars
9 watching
31 forks
Language: Python
Last commit: 7 months ago
Linked from 1 awesome list
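
To illustrate the pattern this project implements, the sketch below builds a generic transformer encoder over a per-pixel timeseries of spectral band values and pools it into a fixed-size embedding, which a small downstream head could then be fine-tuned on. It uses plain PyTorch; the class name, dimensions, and layout are illustrative assumptions, not Presto's actual API (see the repository for the real entry points and pre-trained weights).

```python
import torch
import torch.nn as nn

class TimeseriesEncoder(nn.Module):
    """Generic transformer encoder for per-pixel satellite timeseries.

    Illustrative sketch only: shapes and module choices are assumptions,
    not the Presto API.
    """

    def __init__(self, num_bands: int = 10, d_model: int = 128,
                 nhead: int = 8, num_layers: int = 2):
        super().__init__()
        # Project each timestep's band values into the model dimension
        self.input_proj = nn.Linear(num_bands, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, timesteps, num_bands) -> (batch, d_model)
        h = self.encoder(self.input_proj(x))
        # Mean-pool over the time axis for a fixed-size embedding
        return h.mean(dim=1)

# Example: embed 4 pixel timeseries of 12 monthly observations, 10 bands each
model = TimeseriesEncoder()
timeseries = torch.randn(4, 12, 10)
embeddings = model(timeseries)
print(embeddings.shape)  # torch.Size([4, 128])
```
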
Related projects:
| Repository | Description | Stars |
|---|---|---|
| nasa-impact/prithvi-wxc | A scalable 2D transformer model for predicting weather and climate patterns | 103 |
| sertit/eoreader | A Python library for reading and processing remote sensing data from various satellite sensors | 287 |
| nasaharvest/cropharvest | A remote sensing dataset with associated benchmarks and tools for training machine learning models | 169 |
| open-eo/openeo-python-client | A Python client library for accessing remote sensing data from various sources through the openEO API | 155 |
| esa-philab/opensartoolkit | A toolset for pre-processing and analyzing SAR data from Sentinel-1 satellites | 212 |
| remotesensinginfo/arcsi | Automates atmospheric correction of satellite imagery to produce analysis-ready data | 35 |
| ytarazona/scikit-eo | A Python package for analyzing and processing remote sensing data | 126 |
| chrislemke/sk-transformers | A collection of reusable data transformation tools | 8 |
| leviswind/pytorch-transformer | A PyTorch implementation of a transformer-based translation model | 239 |
| bigscience-workshop/megatron-deepspeed | Tools and scripts for training large transformer language models at scale | 1,335 |
| prlz77/resnext.pytorch | Reproduces ResNeXt (ResNet-V3) in PyTorch for computer vision tasks | 508 |
| alexeypechnikov/pygmtsar | Software for processing satellite interferometry (InSAR) data from Sentinel-1 satellites | 430 |
| jamesoconnor/sentinel_bot | A Twitter bot that processes and posts satellite images from a remote data source | 18 |
| kscottz/pythonfromspace | An introduction to using satellite imagery with Python for data analysis and visualization | 453 |
| microsoft/megatron-deepspeed | A research tool for training large transformer language models at scale | 1,895 |