Framework¶
Pauhu uses the DSPy framework for AI programming.
Why DSPy?¶
DSPy is a framework for programming—not prompting—language models. As the sketch after this list illustrates, it provides:
- Declarative signatures that define inputs and outputs
- Composable modules that chain operations
- Automatic optimization using training examples
- Air-gapped mode for offline and EU-compliant deployments
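To make the first three points concrete, here is a minimal DSPy sketch: a declarative signature wrapped in a composable module. The model string, signature name, and example text are placeholders chosen for this page, not part of Pauhu; in an air-gapped deployment the model would be swapped for a locally hosted one.

import dspy

# Configure the language model backend. The model name below is only a
# placeholder; an offline deployment would point this at a local model.
dspy.configure(lm=dspy.LM("openai/gpt-4o-mini"))

# Declarative signature: named inputs and outputs instead of a hand-written prompt.
class Summarize(dspy.Signature):
    """Summarize a document into a short abstract."""
    document: str = dspy.InputField()
    summary: str = dspy.OutputField()

# Composable module built from the signature; modules like this can be
# chained inside larger programs and optimized against training examples.
summarizer = dspy.ChainOfThought(Summarize)

prediction = summarizer(document="Pauhu programs language models with DSPy rather than prompting them by hand.")
print(prediction.summary)

The optimization mentioned above happens by compiling a module like this with one of DSPy's optimizers (for example, dspy.BootstrapFewShot) and a small set of training examples, rather than by hand-tuning prompts.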
This transparency means you know exactly which framework powers Pauhu, and you can rely on its research-backed approach from Stanford NLP.
Learn More¶
To understand the framework, visit the DSPy Documentation.
Research Papers¶
- [Oct'23] DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines
- [Jun'24] Optimizing Instructions and Demonstrations for Multi-Stage Language Model Programs
- [Jul'24] Fine-Tuning and Prompt Optimization: Two Great Steps that Work Better Together
- [Dec'23] DSPy Assertions: Computational Constraints for Self-Refining Language Model Pipelines
- [Dec'22] Demonstrate-Search-Predict: Composing Retrieval & Language Models for Knowledge-Intensive NLP
Follow @DSPyOSS for updates.
Citation¶
@inproceedings{khattab2024dspy,
title={DSPy: Compiling Declarative Language Model Calls into Self-Improving Pipelines},
author={Khattab, Omar and Singhvi, Arnav and Maheshwari, Paridhi and Zhang, Zhiyuan and Santhanam, Keshav and Vardhamanan, Sri and Haq, Saiful and Sharma, Ashutosh and Joshi, Thomas T. and Moazam, Hanna and Miller, Heather and Zaharia, Matei and Potts, Christopher},
booktitle={The Twelfth International Conference on Learning Representations},
year={2024}
}