# Audio & Speech Recognition AtoZ 🎧

Welcome to the **Audio & Speech Recognition AtoZ** repository! This project is a collection of hands-on Jupyter notebooks that walk through audio processing and speech recognition, from reading and visualizing audio files to converting recorded speech to text.

## Installation 🚀

Follow these steps to get started:

1. **Clone the Repository**:
```bash
git clone https://github.com/sanikamal/audio-speech-recognition-atoz.git
cd audio-speech-recognition-atoz
```

2. **Install Dependencies**:
```bash
pip install -r requirements.txt
```

3. **Launch Jupyter Notebook**:
```bash
jupyter notebook
```

## Table of Contents 📚

| **Title** | **Description** | **Library/Technology** | **Link** | **Article/Blog Link** |
|-----------|-----------------|------------------------|----------|-----------------------|
| Analysis of Audio File | Read an audio file using Librosa, visualize the audio (waveform, spectrogram), interchange the axes, and write to files (see the sketch below). | `Librosa`, `soundfile`, `Matplotlib` | [Link](notebooks/analysis-audio-file.ipynb) | -- |
| Digitization and Recording of Speech | Record speech, digitize it, and use the `recognize_google` function from the `speech_recognition` library to convert audio to text (see the sketch below). | `SpeechRecognition` | [Link](notebooks/Digitization-Recording-Speech.ipynb) | -- |
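
The notebooks are the reference material; as a rough illustration of the first row's workflow, here is a minimal sketch of loading, visualizing, and re-writing an audio file. The file paths are placeholders, and it assumes `librosa` >= 0.10 (which provides `waveshow`):

```python
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np
import soundfile as sf

# Load an audio file at its native sample rate (path is a placeholder).
y, sr = librosa.load("audio/example.wav", sr=None)

fig, (ax_wave, ax_spec) = plt.subplots(2, 1, figsize=(10, 6))

# Waveform: amplitude over time.
librosa.display.waveshow(y, sr=sr, ax=ax_wave)
ax_wave.set_title("Waveform")

# Spectrogram: log-amplitude STFT with time on the x-axis, frequency on the y-axis.
D = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
img = librosa.display.specshow(D, sr=sr, x_axis="time", y_axis="log", ax=ax_spec)
ax_spec.set_title("Spectrogram (dB)")
fig.colorbar(img, ax=ax_spec, format="%+2.0f dB")

plt.tight_layout()
plt.show()

# Write the signal back to disk with soundfile.
sf.write("audio/example_copy.wav", y, sr)
```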

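Similarly, a minimal sketch of the second row's flow: record from the microphone, then send the digitized audio to Google's free web API via `recognize_google`. It assumes a working microphone and the PyAudio backend; a WAV file can be used instead via `sr.AudioFile`.

```python
import speech_recognition as sr

recognizer = sr.Recognizer()

# Record a short utterance from the default microphone (requires PyAudio).
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source, duration=0.5)
    print("Speak now...")
    audio = recognizer.listen(source)

# Convert the digitized audio to text and print the transcript.
try:
    text = recognizer.recognize_google(audio, language="en-US")
    print("Transcript:", text)
except sr.UnknownValueError:
    print("Speech was not understood.")
except sr.RequestError as err:
    print(f"API request failed: {err}")
```
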
## Contributing 🤝

Contributions are welcome! If you'd like to contribute, please fork the repository and create a pull request. For major changes, please open an issue to discuss what you would like to change.

## License 📜

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgements 🙏

A big thank you to all the developers and researchers whose tools and libraries power this project. Special thanks to the open-source community for their valuable contributions.