https://github.com/neurodivergent-dev/zscore
A sophisticated web application for text analysis and Shannon Entropy calculation.
- Host: GitHub
- URL: https://github.com/neurodivergent-dev/zscore
- Owner: neurodivergent-dev
- License: MIT
- Created: 2025-05-02T03:43:55.000Z (11 months ago)
- Default Branch: main
- Last Pushed: 2025-07-09T07:47:01.000Z (9 months ago)
- Last Synced: 2026-02-17T21:54:47.582Z (about 1 month ago)
- Topics: ai-assisted-development, cognitive-ui, cursor-ai, developer-philosophy, eslint, information-theory, language-models, lexical-analysis, netlify, react, recursive-prompting, semantic-analysis, shannon-entropy, software-engineering, tailwindcss, textual-entropy, thought-quantification, typescript, vite, zscore
- Language: TypeScript
- Homepage: https://zscore-ai.netlify.app/
- Size: 1.59 MB
- Stars: 5
- Watchers: 1
- Forks: 0
- Open Issues: 0
Metadata Files:
- Readme: README.md
- License: LICENSE
README
# Shannon Entropy Calculator
A sophisticated web application for text analysis and Shannon entropy calculation. This application provides insights into text complexity, lexical diversity, and information content through statistical analysis.
[Netlify deploy status](https://app.netlify.com/sites/zscore-ai/deploys)
## 🚀 Live Demo
👉 [Click here to try the Shannon Entropy Calculator](https://zscore-ai.netlify.app/)
## Features
- **Text Analysis**: Analyze any text input for statistical properties
- **Word Count**: Get the total number of words in the text
- **Unique Words**: Identify and count unique words in the text
- **Lexical Diversity**: Calculate the ratio of unique words to total words
- **Shannon Entropy**: Measure the information content or unpredictability of text
- **Word Frequency**: Visualize the frequency distribution of words
- **Analysis History**: Save and review past analyses with persistent storage
- **Multi-language Support**: Full internationalization in English and Turkish
- **Responsive Design**: Optimized for both desktop and mobile devices
- **Dark/Light Mode**: Toggle between light and dark themes for comfortable viewing
- **Persistent Settings**: User preferences for language and theme are saved between sessions
## Technology Stack
- **React 18+**: Modern UI library with functional components and hooks
- **TypeScript**: Full type safety throughout the project
- **Vite**: Lightning-fast build tool and development server
- **Tailwind CSS**: Utility-first CSS framework for responsive design
- **Zustand**: Simple state management without boilerplate
- **Framer Motion**: Animation library for smooth UI transitions
- **Intlayer**: Advanced, type-safe internationalization solution for React
- **Persistent Storage**: Local storage for saving user preferences and history
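As an illustration of how Zustand's `persist` middleware can keep settings such as theme and language across sessions, here is a minimal sketch. The store shape, field names, and storage key below are hypothetical, not taken from this repository:

```typescript
import { create } from "zustand";
import { persist } from "zustand/middleware";

// Hypothetical settings store; the real app's state shape may differ.
interface SettingsState {
  theme: "light" | "dark";
  language: "en" | "tr";
  toggleTheme: () => void;
  setLanguage: (language: "en" | "tr") => void;
}

export const useSettings = create<SettingsState>()(
  persist(
    (set) => ({
      theme: "light",
      language: "en",
      toggleTheme: () =>
        set((s) => ({ theme: s.theme === "light" ? "dark" : "light" })),
      setLanguage: (language) => set({ language }),
    }),
    { name: "zscore-settings" } // localStorage key used by the middleware
  )
);
```

With `persist`, the middleware serializes the store to `localStorage` under the given key, so theme and language survive a page reload without any extra code.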
## Getting Started
### Prerequisites
Make sure you have the following installed:
- Node.js (v16 or later)
- npm (v7 or later)
### Installation
1. Clone the repository:
```bash
git clone https://github.com/neurodivergent-dev/zscore.git
cd zscore
```
2. Install dependencies:
```bash
npm install
```
3. Run the development server:
```bash
npm run dev
```
4. Open your browser and navigate to `http://localhost:5173`
## About Shannon Entropy
Shannon entropy is a measure of the average level of information or "surprise" inherent in a variable's possible outcomes. It quantifies the expected value of the information contained in a message and is measured in bits, nats, or bans depending on the logarithm base used.
### Formula
The Shannon entropy H(X) of a discrete random variable X is given by:
```
H(X) = -Σ p(x) log₂ p(x)
```
Where:
- p(x) is the probability mass function of X
- log₂ is the binary logarithm (base 2)
In text analysis, this measures how unpredictable or informative the text is. Higher entropy indicates more diverse, unpredictable text, while lower entropy suggests more predictable, repetitive text.
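The word-level metrics described above can be sketched in a few lines of TypeScript. The function names here are illustrative, not the app's actual API:

```typescript
// Illustrative helpers for the metrics described above; not the app's real API.

/** Split text into lowercase word tokens (letters, digits, apostrophes). */
function tokenize(text: string): string[] {
  return text.toLowerCase().match(/[\p{L}\p{N}']+/gu) ?? [];
}

/** Shannon entropy H = -Σ p(w) log₂ p(w) over the word distribution, in bits. */
function shannonEntropy(tokens: string[]): number {
  const counts = new Map<string, number>();
  for (const t of tokens) counts.set(t, (counts.get(t) ?? 0) + 1);
  let h = 0;
  for (const c of counts.values()) {
    const p = c / tokens.length;
    h -= p * Math.log2(p);
  }
  return h;
}

/** Lexical diversity: ratio of unique words to total words (type–token ratio). */
function lexicalDiversity(tokens: string[]): number {
  return tokens.length === 0 ? 0 : new Set(tokens).size / tokens.length;
}
```

For example, `shannonEntropy(tokenize("a a b b"))` is exactly 1 bit (two equally likely words), while fully repetitive text scores 0.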
## 🌐 Internationalization
The application supports multiple languages:
- English (default)
- Turkish
Language settings are automatically saved and restored between sessions. You can switch languages using the language toggle in the navigation bar.
**Advanced Internationalization Guide:**
For a comprehensive, type-safe, and advanced i18n setup using Intlayer (React + Vite + TypeScript), see the [Intlayer Integration Guide](./intlayer.md).
## 📱 Mobile Support
The application is fully responsive and optimized for mobile devices:
- Adaptive layout for different screen sizes
- Touch-friendly interface
- Hamburger menu for navigation on small screens
- Optimized reading experience on mobile devices
## 📜 ZScore Manifesto
Learn about the philosophy and design process behind this project.
👉 [Read the manifesto](./zscore-manifesto.md)
## 🧪 Try These Samples
- "The sky above the port was the color of television, tuned to a dead channel."
- "All human beings are born free and equal in dignity and rights."
- "AAAAAAAAAAAAAAA"
## 🖼️ Preview

## License
This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.
## Acknowledgments
- Claude Shannon for his groundbreaking work in information theory
- The React team for their excellent framework
- The open-source community for providing amazing tools
- Contributors who helped improve the project