andrewdalpino/NoPE-GPT
A GPT-style small language model (SLM) with no positional embeddings (NoPE).
This project offers a compact, open-source language model that can answer questions, summarize documents, and interact conversationally. It takes in text prompts or chat messages and generates human-like text responses. It's designed for machine learning engineers and researchers who need a flexible, performant, and memory-efficient foundation model to build upon or integrate into applications.
Available on PyPI.
Use this if you need a small, efficient, and customizable language model to integrate into your applications, or want to experiment with architectural features such as NoPE.
Not ideal if you are an end-user simply looking to chat with an AI and do not have the technical expertise to deploy or fine-tune models.
Stars: 8
Forks: 2
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 11, 2026
Commits (30d): 0
Dependencies: 3
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/llm-tools/andrewdalpino/NoPE-GPT"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
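The same endpoint can be called from code. A minimal Python sketch using only the standard library, assuming the endpoint returns JSON (the response schema is not documented here, so inspect the returned dict before relying on any keys; the function names `quality_url` and `fetch_quality` are illustrative, not part of the API):

```python
import json
from urllib.request import urlopen

API_BASE = "https://pt-edge.onrender.com/api/v1/quality/llm-tools"

def quality_url(owner: str, repo: str) -> str:
    # Build the endpoint URL for a given GitHub owner/repo pair.
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    # Fetch and decode the JSON quality report. Keys are unknown
    # until inspected, so callers should not assume a schema.
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Example: the URL for this repository (no request is made here).
print(quality_url("andrewdalpino", "NoPE-GPT"))
```

Unauthenticated calls are limited to 100 requests per day; a free key raises that to 1,000.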
Related tools
Nixtla/nixtla
TimeGPT-1: production ready pre-trained Time Series Foundation Model for forecasting and...
sigdelsanjog/gptmed
pip install gptmed
akanyaani/gpt-2-tensorflow2.0
OpenAI GPT2 pre-training and sequence prediction implementation in Tensorflow 2.0
samkamau81/FinGPT_
FinGPT is an AI language model designed to understand and generate financial content. Built upon...
VinAIResearch/PhoGPT
PhoGPT: Generative Pre-training for Vietnamese (2023)