---
name: Refactor
about: Suggest a code refactor or deprecation
title: ''
labels: needs triage
assignees: ''
---

## Proposed refactor

<!-- A clear and concise description of the refactor -->

### Motivation

<!-- Please outline the motivation for the proposal. If this is related to another GitHub issue, please link it here -->

### Pitch

<!-- A clear and concise description of what you want to happen. -->

### Additional context

<!-- Add any other context or screenshots here. -->

______________________________________________________________________

#### If you enjoy Lightning, check out our other projects! ⚡

- [**Metrics**](https://github.com/Lightning-AI/metrics): Machine learning metrics for distributed, scalable PyTorch applications.

- [**Lite**](https://pytorch-lightning.readthedocs.io/en/latest/starter/lightning_lite.html): enables pure PyTorch users to scale their existing code on any kind of device while retaining full control over their own loops and optimization logic.

- [**Flash**](https://github.com/Lightning-AI/lightning-flash): The fastest way to get a Lightning baseline! A collection of tasks for fast prototyping, baselining, fine-tuning, and solving problems with deep learning.

- [**Bolts**](https://github.com/Lightning-AI/lightning-bolts): Pretrained SOTA Deep Learning models, callbacks, and more for research and production with PyTorch Lightning and PyTorch.

- [**Lightning Transformers**](https://github.com/Lightning-AI/lightning-transformers): Flexible interface for high-performance research using SOTA Transformers leveraging PyTorch Lightning, Transformers, and Hydra.