{"id":29097380,"url":"https://github.com/microsoftcloudessentials-learninghub/pdfs-invoice-processing-fapp-openframework","last_synced_at":"2026-02-02T22:06:13.561Z","repository":{"id":294306907,"uuid":"986560725","full_name":"MicrosoftCloudEssentials-LearningHub/PDFs-Invoice-Processing-Fapp-OpenFramework","owner":"MicrosoftCloudEssentials-LearningHub","description":"Example of how to create to extract PDFs from an Azure Storage Account, process them using Azure resources and Open Framework, and store the results in Cosmos DB for further analysis.","archived":false,"fork":false,"pushed_at":"2025-10-29T18:35:29.000Z","size":123,"stargazers_count":0,"open_issues_count":0,"forks_count":0,"subscribers_count":0,"default_branch":"main","last_synced_at":"2025-10-29T20:45:36.839Z","etag":null,"topics":["ai-use-cases-strategy","automated-extraction","cosmos-db","full-code","function-app","open-frameworks","pdf-invoice","storage-account"],"latest_commit_sha":null,"homepage":"","language":"HCL","has_issues":true,"has_wiki":null,"has_pages":null,"mirror_url":null,"source_name":null,"license":"mit","status":null,"scm":"git","pull_requests_enabled":true,"icon_url":"https://github.com/MicrosoftCloudEssentials-LearningHub.png","metadata":{"files":{"readme":"README.md","changelog":null,"contributing":null,"funding":null,"license":"LICENSE","code_of_conduct":null,"threat_model":null,"audit":null,"citation":null,"codeowners":null,"security":null,"support":null,"governance":null,"roadmap":null,"authors":null,"dei":null,"publiccode":null,"codemeta":null,"zenodo":null,"notice":null,"maintainers":null,"copyright":null,"agents":null,"dco":null,"cla":null}},"created_at":"2025-05-19T19:46:52.000Z","updated_at":"2025-10-29T18:34:21.000Z","dependencies_parsed_at":"2025-05-19T20:55:39.456Z","dependency_job_id":"671550c3-d32d-415c-ae15-0f8aee48d775","html_url":"https://github.com/MicrosoftCloudEssentials-LearningHub/PDFs-Invoice-Processing-Fapp-OpenFramework","commit_stats":null,"previous_names"
:["microsoftcloudessentials-learninghub/pdf-processing-fapp-openframework","microsoftcloudessentials-learninghub/pdfs-invoice-processing-fapp-openframework"],"tags_count":0,"template":false,"template_full_name":null,"purl":"pkg:github/MicrosoftCloudEssentials-LearningHub/PDFs-Invoice-Processing-Fapp-OpenFramework","repository_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MicrosoftCloudEssentials-LearningHub%2FPDFs-Invoice-Processing-Fapp-OpenFramework","tags_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MicrosoftCloudEssentials-LearningHub%2FPDFs-Invoice-Processing-Fapp-OpenFramework/tags","releases_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MicrosoftCloudEssentials-LearningHub%2FPDFs-Invoice-Processing-Fapp-OpenFramework/releases","manifests_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MicrosoftCloudEssentials-LearningHub%2FPDFs-Invoice-Processing-Fapp-OpenFramework/manifests","owner_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners/MicrosoftCloudEssentials-LearningHub","download_url":"https://codeload.github.com/MicrosoftCloudEssentials-LearningHub/PDFs-Invoice-Processing-Fapp-OpenFramework/tar.gz/refs/heads/main","sbom_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/MicrosoftCloudEssentials-LearningHub%2FPDFs-Invoice-Processing-Fapp-OpenFramework/sbom","scorecard":null,"host":{"name":"GitHub","url":"https://github.com","kind":"github","repositories_count":286080680,"owners_count":28756442,"icon_url":"https://github.com/github.png","version":null,"created_at":"2022-05-30T11:31:42.601Z","updated_at":"2026-01-25T16:32:25.380Z","status":"ssl_error","status_checked_at":"2026-01-25T16:32:09.189Z","response_time":113,"last_error":"SSL_read: unexpected eof while 
reading","robots_txt_status":"success","robots_txt_updated_at":"2025-07-24T06:49:26.215Z","robots_txt_url":"https://github.com/robots.txt","online":false,"can_crawl_api":true,"host_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub","repositories_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories","repository_names_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/repository_names","owners_url":"https://repos.ecosyste.ms/api/v1/hosts/GitHub/owners"}},"keywords":["ai-use-cases-strategy","automated-extraction","cosmos-db","full-code","function-app","open-frameworks","pdf-invoice","storage-account"],"created_at":"2025-06-28T13:42:03.985Z","updated_at":"2026-01-25T18:31:53.824Z","avatar_url":"https://github.com/MicrosoftCloudEssentials-LearningHub.png","language":"HCL","readme":"# Demo: Automated PDF Invoice Processing with Open Framework (full-code approach)\n\n`Azure Storage + Function App + Open Framework +  Cosmos DB`\n\nCosta Rica\n\n[![GitHub](https://img.shields.io/badge/--181717?logo=github\u0026logoColor=ffffff)](https://github.com/)\n[brown9804](https://github.com/brown9804)\n\nLast updated: 2025-09-17\n\n----------\n\n\u003e [!IMPORTANT]\n\u003e This example is based on a `public network site and is intended for demonstration purposes only`. It showcases how several Azure resources can work together to achieve the desired result. Consider the section below about [Important Considerations for Production Environment](#important-considerations-for-production-environment). Please note that `these demos are intended as a guide and are based on my personal experiences. 
For official guidance, support, or more detailed information, please refer to Microsoft's official documentation or contact Microsoft directly`: [Microsoft Sales and Support](https://support.microsoft.com/contactus?ContactUsExperienceEntryPointAssetId=S.HP.SMC-HOME)\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eList of References\u003c/b\u003e (Click to expand)\u003c/summary\u003e\n\n- [Power Apps pricing](https://www.microsoft.com/en-us/power-platform/products/power-apps/)\n- [Create a Fabric data agent (preview)](https://learn.microsoft.com/en-us/fabric/data-science/how-to-create-data-agent) `enables users to interact with data stored in lakehouses, warehouses, Power BI semantic models, and KQL databases using natural language queries`. Key prerequisites include having a paid Fabric capacity, enabling specific tenant settings, and ensuring data sources are accessible.\n  - To create a data agent, users navigate to their workspace, select the data agent option, and configure it by adding `up to five data sources`. Users can ask questions in plain English, and the system translates these into structured queries (SQL, DAX, KQL) to retrieve data. The document outlines the process for creating, configuring, and sharing the data agent, including providing instructions and example queries to enhance performance.\n  - The data agent operates under the user’s Microsoft Entra ID permissions, ensuring secure access to data. `It does not support complex reasoning or advanced analytics, focusing instead on retrieving structured data based on user queries`. Users can publish their data agent for colleagues to use after testing its performance. 
We can extend this with [Extend your agent with Model Context Protocol](https://learn.microsoft.com/en-us/microsoft-copilot-studio/agent-extend-action-mcp), which allows users to connect to existing knowledge servers and data sources, providing access to resources, tools, and predefined prompts for specific tasks.\n\n      \u003cimg width=\"1898\" height=\"995\" alt=\"image\" src=\"https://github.com/user-attachments/assets/3845e948-71be-4ce2-9eb6-94e52e51acde\" /\u003e\n\n- [Document Processor](https://learn.microsoft.com/en-us/microsoft-copilot-studio/template-managed-document-processor) managed agent in Microsoft Copilot Studio. `E2E solution for document processing, including extraction, validation, human monitoring, and exporting to downstream applications. Users can upload a sample document and configure extraction without needing to label data or train custom models. The agent informs users of the processing status and allows for manual verification of extracted data in the Validation Station`. Key prerequisites include licenses for Copilot Studio and Power Platform, enabling the Power Apps component framework, and specific security roles and permissions. Limitations on document processing include file size (less than 25 MB), file types (PNG, JPG, JPEG, PDF), and a maximum of 50 pages. Users can set up the agent by creating connections with required services, configuring data fields for extraction, and creating validation rules. The agent can notify reviewers when validation fails, and users can interact with the agent via Microsoft Teams. 
Here is more about `setting up the agent, including uploading sample documents, defining validation rules, and selecting document sources.` [Use an autonomous agent in Copilot Studio for document processing](https://learn.microsoft.com/en-us/power-platform/architecture/reference-architectures/document-processing-agent)\n\n    \u003cimg width=\"1332\" height=\"844\" alt=\"image\" src=\"https://github.com/user-attachments/assets/879c048d-1508-4ee1-a62f-1e1593b74f59\" /\u003e\n\n- [Solution Accelerator for AI Document Processor (ADP)](https://github.com/azure/ai-document-processor) - AI Factory\n\n    | **Category**                  | **Details**                                                                                                                                                                                                 |\n    |------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|\n    | **Purpose**                  | Automate document processing using Azure services and LLMs. Extracts data from files (PDF, Word, MP3), processes via Azure OpenAI, and outputs structured insights (JSON/CSV) to blob storage.         |\n    | **Infrastructure Provisioning** | Uses Bicep templates to deploy all required Azure resources. Automates setup of networking, identity, and access controls using RBAC and managed identities to ensure secure and scalable deployment. |\n    | **Main Azure Resources**     | - **Azure Function App**: Hosts the orchestrator and activity functions using Durable Functions to manage the document processing workflow.\u003cbr\u003e- **Azure Storage Account**: Stores input documents (bronze container) and output results (gold container). 
Also holds prompt configurations if UI is not deployed.\u003cbr\u003e- **Azure Static Web App**: Provides a user-friendly interface for uploading files, editing prompts, and triggering workflows.\u003cbr\u003e- **Azure OpenAI**: Processes extracted text using LLMs to generate structured summaries or insights.\u003cbr\u003e- **Azure Cognitive Services**: Specifically uses Document Intelligence for OCR and text extraction from uploaded files.\u003cbr\u003e- **Cosmos DB**: Stores prompt configurations when the frontend UI is enabled, allowing dynamic updates from the web interface.\u003cbr\u003e- **Key Vault**: Securely stores secrets, keys, and credentials used by the Function App and other services.\u003cbr\u003e- **Application Insights**: Enables monitoring, logging, and diagnostics for the Function App and other components.\u003cbr\u003e- **App Service Plan**: Provides the compute resources for running the Function App.                                                                 |\n    | **Pipeline Components**      | - `function_app.py`: Main orchestrator using Durable Functions chaining pattern.\u003cbr\u003e- `activities/runDocIntel.py`: Extracts text from documents using Azure Document Intelligence.\u003cbr\u003e- `activities/callAoai.py`: Sends extracted text and prompt to Azure OpenAI and receives structured JSON.\u003cbr\u003e- `activities/writeToBlob.py`: Writes the final output to the gold container in blob storage. |\n    | **Data Flow**                | 1. Upload document to bronze container\u003cbr\u003e2. OCR via Document Intelligence\u003cbr\u003e3. Send extracted text + prompt to Azure OpenAI\u003cbr\u003e4. Receive structured JSON\u003cbr\u003e5. 
Write output to gold container           |\n    | **Frontend UI (Optional)**  | - Allows business users to upload files and edit prompts\u003cbr\u003e- Prompts are stored in Cosmos DB\u003cbr\u003e- Users can trigger workflows and view job status directly from the interface                             |\n    | **Prompt Configuration**     | - Without UI: Prompts are stored in `prompts.yaml` file in blob storage\u003cbr\u003e- With UI: Prompts are stored and managed in Cosmos DB via the web interface                                                  |\n    | **Deployment Steps**         | 1. Fork and clone the GitHub repo\u003cbr\u003e2. Run `az login`, `azd auth login`, `azd up`\u003cbr\u003e3. Provide User Principal ID for RBAC setup\u003cbr\u003e4. Choose whether to deploy frontend UI                              |\n    | **Execution (Without UI)**   | - Update `prompts.yaml` with desired instructions\u003cbr\u003e- Send POST request to `http_start` endpoint with blob metadata\u003cbr\u003e- Monitor pipeline execution via Log Stream                                       |\n    | **Execution (With UI)**      | - Upload files via web interface\u003cbr\u003e- Edit system and user prompts\u003cbr\u003e- Click \"Start Workflow\" to trigger pipeline\u003cbr\u003e- View success/failure messages and job status                                     |\n    | **Monitoring \u0026 Troubleshooting** | - Use Log Stream for real-time logs\u003cbr\u003e- Use Log Analytics Workspace to query exceptions and performance metrics\u003cbr\u003e- Use SSH console in Development Tools to inspect deployment logs and file system |\n    | **Pre-Requisites**           | - Azure CLI\u003cbr\u003e- Azure Developer CLI (azd)\u003cbr\u003e- Node.js 18.x.x\u003cbr\u003e- npm 9.x.x\u003cbr\u003e- Python 3.11                                                                                                             |\n    | **License**                  | MIT License – Free to use, modify, and 
distribute with attribution. No warranty provided.                                                                                                                  |\n\n    \u003cimg width=\"835\" height=\"535\" alt=\"image\" src=\"https://github.com/user-attachments/assets/61dbac57-f635-4dd6-9292-50e51823a8c4\" /\u003e\n    \n    \u003e Data flow: \n    \n    \u003cimg width=\"930\" height=\"620\" alt=\"image\" src=\"https://github.com/user-attachments/assets/2f01a07b-71cd-4ee9-b316-c0a2273d01b2\" /\u003e\n\n    \u003e ZTA:\n    \u003cimg width=\"1840\" height=\"935\" alt=\"image\" src=\"https://github.com/user-attachments/assets/947ddf0d-daf3-4df3-949c-1271a3fef7bb\" /\u003e\n\n- [Use Azure AI services with SynapseML in Microsoft Fabric](https://learn.microsoft.com/en-us/fabric/data-science/how-to-use-ai-services-with-synapseml)\n- [Plan and manage costs for Azure AI Foundry](https://learn.microsoft.com/en-us/azure/ai-foundry/how-to/costs-plan-manage)\n- [Azure Cosmos DB - Database for the AI Era](https://learn.microsoft.com/en-us/azure/cosmos-db/introduction)\n- [What is Azure SQL Database?](https://learn.microsoft.com/en-us/azure/azure-sql/database/sql-database-paas-overview?view=azuresql)\n- [App settings reference for Azure Functions](https://learn.microsoft.com/en-us/azure/azure-functions/functions-app-settings)\n- [Storage considerations for Azure Functions](https://learn.microsoft.com/en-us/azure/azure-functions/storage-considerations?tabs=azure-cli)\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003eTable of Content\u003c/b\u003e (Click to expand)\u003c/summary\u003e\n\n- [Prerequisites](#prerequisites)\n- [Where to start?](#where-to-start)\n- [Important Considerations for Production Environment](#important-considerations-for-production-environment)\n- [Overview](#overview)\n- [Function App Hosting Options](#function-app-hosting-options)\n- [Step 1: Set Up Your Azure Environment](#step-1-set-up-your-azure-environment)\n- [Step 
2: Set Up Azure Blob Storage for PDF Ingestion](#step-2-set-up-azure-blob-storage-for-pdf-ingestion)\n- [Step 3: Set Up Azure Cosmos DB](#step-3-set-up-azure-cosmos-db)\n- [Step 4: Set Up Azure Functions for Document Ingestion and Processing](#step-4-set-up-azure-functions-for-document-ingestion-and-processing)\n  - [Create a Function App](#create-a-function-app)\n  - [Configure/Validate the Environment variables](#configurevalidate-the-environment-variables)\n  - [Develop the Function](#develop-the-function)\n- [Step 5: Test the solution](#step-5-test-the-solution)\n\n\u003c/details\u003e\n\n\u003e `How do we move from basic coding all the way to AI agents?`\n\n```mermaid\nflowchart LR\n    A[Scripting: Line-by-line instructions] --\u003e B[Machine Learning: Packages + statistical foundations]\n    B --\u003e C[LLMs: Reasoning, understanding, human-like responses]\n    C --\u003e D[Agents: LLMs with ability to act]\n\n    %% Styling\n    classDef step fill:#4a90e2,stroke:#333,stroke-width:2px,color:#fff,font-weight:bold;\n    class A,B,C,D step;\n\n    %% Extra notes\n    A:::step\n    B:::step\n    C:::step\n    D:::step\n```\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cb\u003e More details about it here \u003c/b\u003e (Click to expand)\u003c/summary\u003e\n  \n\u003e - We all `start with scripting`; no matter the language, it’s the first step. `Simple/complex instructions, written line by line`, to get something done\n\u003e - Then comes `machine learning`. At this stage, we’re not reinventing the math, we’re `leveraging powerful packages built on deep statistical and mathematical foundations.` These tools let us `automate smarter processes, like reviewing claims with predictive analytics. You’re not just coding anymore; you’re building systems that learn and adapt.`\n\u003e - `LLMs`. This is what most people mean when they say `AI.` Think of `yourself as the architect, and the LLM as your strategic engine. 
You can plug into it via an API, a key, or through integrated services. It’s not just about automation, it’s about reasoning, understanding, and generating human-like responses.`\n\u003e - And finally, `agents`. These are LLMs with the `ability to act`. They don’t just respond, `they take initiative. They can create code, trigger workflows, make decisions, interact with tools, with other agents. It’s where intelligence meets execution`\n\n\u003c/details\u003e\n\n\u003e [!NOTE]\n\u003e A landing zone is a general `cloud framework that sets up the core structure for all workloads`. Each use case (like an app, data pipeline, or API) then builds on top of this framework, using the `same environments (Dev → Test → UAT → Prod) and CI/CD pipelines to move code safely into production.` It’s general by design, but `applied per use case.`\n\n\u003e How to parse PDFs from an Azure Storage Account, process them using an Open Framework (needs manual configuration), and store the results in Cosmos DB for further analysis. \u003cbr/\u003e \u003cbr/\u003e\n\u003e\n\u003e 1. Upload your PDFs to an Azure Blob Storage container. \u003cbr/\u003e\n\u003e 2. An Azure Function is triggered by the upload, which uses an Open Framework and multiple customizations as part of an API call to analyze the PDFs.  \u003cbr/\u003e\n\u003e 3. The extracted data is parsed and subsequently stored in a Cosmos DB database, ensuring a seamless and automated workflow from document upload to data storage. \n\n\u003e [!NOTE]\n\u003e Limitations of this approach: \u003cbr/\u003e\n\u003e\n\u003e - Requires significant manual effort to structure and format extracted data. \u003cbr/\u003e\n\u003e - Limited in handling complex layouts and non-text elements like images and charts. 
\u003cbr/\u003e\n\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"https://github.com/user-attachments/assets/cda874fc-6cca-4857-ac5d-2f8d7887e36d\" alt=\"Centered Image\" style=\"border: 2px solid #4CAF50; border-radius: 5px; padding: 5px;\"/\u003e\n\u003c/div\u003e\n\n## Important Considerations for Production Environment\n\n\u003cdetails\u003e\n  \u003csummary\u003ePrivate Network Configuration\u003c/summary\u003e\n\n \u003e For enhanced security, consider configuring your Azure resources to operate within a private network. This can be achieved using Azure Virtual Network (VNet) to isolate your resources and control inbound and outbound traffic. Implementing private endpoints for services like Azure Blob Storage and Azure Functions can further secure your data by restricting access to your VNet.\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n  \u003csummary\u003eSecurity\u003c/summary\u003e\n\n  \u003e Ensure that you implement appropriate security measures when deploying this solution in a production environment. This includes: \u003cbr/\u003e\n  \u003e\n  \u003e - Securing Access: Use Microsoft Entra ID (formerly known as Azure Active Directory or Azure AD) for authentication and role-based access control (RBAC) to manage permissions. \u003cbr/\u003e\n  \u003e - Managing Secrets: Store sensitive information such as connection strings and API keys in Azure Key Vault. \u003cbr/\u003e\n  \u003e - Data Encryption: Enable encryption for data at rest and in transit to protect sensitive information.\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n  \u003csummary\u003eScalability\u003c/summary\u003e\n\n  \u003e While this example provides a basic setup, you may need to scale the resources based on your specific requirements. Azure services offer various scaling options to handle increased workloads. 
Consider using: \u003cbr/\u003e\n  \u003e\n  \u003e - Auto-scaling: Configure auto-scaling for Azure Functions and other services to automatically adjust based on demand. \u003cbr/\u003e\n  \u003e - Load Balancing: Use Azure Load Balancer or Application Gateway to distribute traffic and ensure high availability.\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n  \u003csummary\u003eCost Management\u003c/summary\u003e\n\n  \u003e Monitor and manage the costs associated with your Azure resources. Use Azure Cost Management and Billing to track usage and optimize resource allocation.\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n  \u003csummary\u003eCompliance\u003c/summary\u003e\n\n  \u003e Ensure that your deployment complies with relevant regulations and standards. Use Azure Policy to enforce compliance and governance policies across your resources.\n\u003c/details\u003e\n\n\u003cdetails\u003e\n  \u003csummary\u003eDisaster Recovery\u003c/summary\u003e\n   \n\u003e Implement a disaster recovery plan to ensure business continuity in case of failures. Use Azure Site Recovery and backup solutions to protect your data and applications.\n\n\u003c/details\u003e\n\n## Prerequisites\n\n- An `Azure subscription is required`. All other resources, including instructions for creating a Resource Group, are provided in this workshop.\n- `Contributor role assigned or any custom role that allows`: access to manage all resources, and the ability to deploy resources within subscription.\n- If you choose to use the Terraform approach, please ensure that:\n  - [Terraform is installed on your local machine](https://developer.hashicorp.com/terraform/tutorials/azure-get-started/install-cli#install-terraform).\n  - [Install the Azure CLI](https://learn.microsoft.com/en-us/cli/azure/install-azure-cli) to work with both Terraform and Azure commands.\n\n## Where to start? 
\n\n\u003e Please follow the steps described below.\n\n- If you're choosing the `Infrastructure via Azure Portal`, please start [here in this section](#step-1-set-up-your-azure-environment).\n- If you're choosing the `Infrastructure via Terraform` approach:\n    1. Please follow the [Terraform guide](./terraform-infrastructure/) to deploy the necessary Azure resources for the workshop.\n    2. Then, follow [each section](#step-1-set-up-your-azure-environment) but `skip the creation of each resource`.\n\n\u003e [!IMPORTANT]\n\u003e Regarding `Networking`, this example will cover `Public access configuration`, and `system-managed identity`. However, please ensure you `review your privacy requirements and adjust network and access settings as necessary for your specific case`.\n\n## Overview \n\n\u003e Using Cosmos DB provides you with a flexible, scalable, and globally distributed database solution that can handle both structured and semi-structured data efficiently. \u003cbr/\u003e\n\u003e\n\u003e - `Azure Blob Storage`: Store the PDF invoices. \u003cbr/\u003e\n\u003e - `Azure Functions`: Trigger on new PDF uploads, extract data, and process it. \u003cbr/\u003e\n\u003e - `Azure SQL Database or Cosmos DB`: Store the extracted data for querying and analytics. \u003cbr/\u003e \n\n| Resource                  | Recommendation                                                                                                      |\n|---------------------------|----------------------------------------------------------------------------------------------------------------------|\n| **Azure Blob Storage**    | Use for storing the PDF files. This keeps your file storage separate from your data storage, which is a common best practice. |\n| **Azure SQL Database**    | Use if your data is highly structured and you need complex queries and transactions.                                  
|\n| **Azure Cosmos DB**       | Use if you need a globally distributed database with low latency and the ability to handle semi-structured data.      |\n\n## Function App Hosting Options \n\n\u003e In the context of Azure Function Apps, a `hosting option refers to the plan you choose to run your function app`. This choice affects how your function app is scaled, the resources available to each function app instance, and the support for advanced functionalities like virtual network connectivity and container support.\n\n\u003e [!TIP]  \n\u003e\n\u003e - `Scale to Zero`: Indicates whether the service can automatically scale down to zero instances when idle.  \n\u003e   - **IDLE** stands for:  \n\u003e     - **I** – Inactive  \n\u003e     - **D** – During  \n\u003e     - **L** – Low  \n\u003e     - **E** – Engagement  \n\u003e   - In other words, when the application is not actively handling requests or events (it's in a low-activity or paused state).\n\u003e - `Scale Behavior`: Describes how the service scales (e.g., `event-driven`, `dedicated`, or `containerized`).  \n\u003e - `Virtual Networking`: Whether the service supports integration with virtual networks for secure communication.  \n\u003e - `Dedicated Compute \u0026 Reserved Cold Start`: Availability of always-on compute to avoid cold starts and ensure low latency.  \n\u003e - `Max Scale Out (Instances)`: Maximum number of instances the service can scale out to.  
\n\u003e - `Example AI Use Cases`: Real-world scenarios where each plan excels.\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cstrong\u003eFlex Consumption\u003c/strong\u003e\u003c/summary\u003e\n\n| Feature | Description |\n|--------|-------------|\n| **Scale to Zero** | `Yes` |\n| **Scale Behavior** | `Fast event-driven` |\n| **Virtual Networking** | `Optional` |\n| **Dedicated Compute \u0026 Reserved Cold Start** | `Optional (Always Ready)` |\n| **Max Scale Out (Instances)** | `1000` |\n| **Example AI Use Cases** | `Real-time data processing` for AI models, `high-traffic AI-powered APIs`, `event-driven AI microservices`. Ideal for fraud detection, real-time recommendations, NLP, and computer vision services. |\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cstrong\u003eConsumption\u003c/strong\u003e\u003c/summary\u003e\n\n| Feature | Description |\n|--------|-------------|\n| **Scale to Zero** | `Yes` |\n| **Scale Behavior** | `Event-driven` |\n| **Virtual Networking** | `Optional` |\n| **Dedicated Compute \u0026 Reserved Cold Start** | `No` |\n| **Max Scale Out (Instances)** | `200` |\n| **Example AI Use Cases** | `Lightweight AI APIs`, `scheduled AI tasks`, `low-traffic AI event processing`. Great for sentiment analysis, simple image recognition, and batch ML tasks. |\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cstrong\u003eFunctions Premium\u003c/strong\u003e\u003c/summary\u003e\n\n| Feature | Description |\n|--------|-------------|\n| **Scale to Zero** | `No` |\n| **Scale Behavior** | `Event-driven with premium options` |\n| **Virtual Networking** | `Yes` |\n| **Dedicated Compute \u0026 Reserved Cold Start** | `Yes` |\n| **Max Scale Out (Instances)** | `100` |\n| **Example AI Use Cases** | `Enterprise AI applications`, `low-latency AI APIs`, `VNet integration`. Ideal for secure, high-performance AI services like customer support and analytics. 
|\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cstrong\u003eApp Service\u003c/strong\u003e\u003c/summary\u003e\n\n| Feature | Description |\n|--------|-------------|\n| **Scale to Zero** | `No` |\n| **Scale Behavior** | `Dedicated VMs` |\n| **Virtual Networking** | `Yes` |\n| **Dedicated Compute \u0026 Reserved Cold Start** | `Yes` |\n| **Max Scale Out (Instances)** | `Varies` |\n| **Example AI Use Cases** | `AI-powered web applications`, `dedicated resources`. Great for chatbots, personalized content, and intensive AI inference. |\n\n\u003c/details\u003e\n\n\u003cdetails\u003e\n\u003csummary\u003e\u003cstrong\u003eContainer Apps Env.\u003c/strong\u003e\u003c/summary\u003e\n\n| Feature | Description |\n|--------|-------------|\n| **Scale to Zero** | `No` |\n| **Scale Behavior** | `Containerized microservices environment` |\n| **Virtual Networking** | `Yes` |\n| **Dedicated Compute \u0026 Reserved Cold Start** | `Yes` |\n| **Max Scale Out (Instances)** | `Varies` |\n| **Example AI Use Cases** | `AI microservices architecture`, `containerized AI workloads`, `complex AI workflows`. Ideal for orchestrating AI services like image processing, text analysis, and real-time analytics. |\n\n\u003c/details\u003e\n\n## Step 1: Set Up Your Azure Environment\n\n\u003e An Azure `Resource Group` is a `container that holds related resources for an Azure solution`.\n\u003e It can include all the resources for the solution or only those you want to manage as a group.\n\u003e Typically, resources that share the same lifecycle are added to the same resource group, allowing for easier deployment, updating, and deletion as a unit.\n\u003e Resource groups also store metadata about the resources, and you can apply access control, locks, and tags to them for better management and organization.\n\n1. **Create an Azure Account**: If you don't have one, sign up for an Azure account.\n2. 
**Create a Resource Group**:\n   - Go to the Azure portal.\n   - Navigate to **Resource groups**.\n   - Click **+ Create**.\n\n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/56d1e99f-0a22-4492-bd6f-d4e3a76aedd8\"\u003e\n\n   - Enter the Resource Group name (e.g., `RGContosoAI`) and select a region (e.g., `East US 2`). You can add tags if needed.\n   - Click **Review + create** and then **Create**.\n\n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/324c1157-9566-4b30-bb36-bd0efb0a1bf3\"\u003e\n\n## Step 2: Set Up Azure Blob Storage for PDF Ingestion\n\n\u003e An `Azure Storage Account` provides a `unique namespace in Azure for your data, allowing you to store and manage various types of data such as blobs, files, queues, and tables`. It serves as the foundation for all Azure Storage services, ensuring high availability, scalability, and security for your data. \u003cbr/\u003e \u003cbr/\u003e\n\u003e A `Blob Container` is a `logical grouping of blobs within an Azure Storage Account, similar to a directory in a file system`. Containers help organize and manage blobs, which can be any type of unstructured data like text or binary data. Each container can store an unlimited number of blobs, and you must create a container before uploading any blobs.\n\n1. 
**Create a Storage Account**:\n   - In the Azure portal, navigate to your **Resource Group**.\n   - Click **+ Create**.\n\n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/dd4579b3-2f95-4a24-b9ef-178ee14c9e98\"\u003e\n\n   - Search for `Storage Account`.\n  \n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/09616373-c3b2-459c-89b8-59c6db6beaea\"\u003e\n\n   - Select the Resource Group you created.\n   - Enter a Storage Account name (e.g., `contosostorageaidemo`).\n   - Choose the region and performance options, and click `Next` to continue.\n\n        \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/4db31956-2d36-4581-98cf-ec0e68d55037\"\u003e\n\n   - If you need to modify anything related to `Security, Access protocols, Blob Storage Tier`, you can do that in the `Advanced` tab.\n\n        \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/5d3da139-6e7a-4bb6-a695-deb1c314ccd3\"\u003e\n\n   - Regarding `Networking`, this example will cover `Public access` configuration. However, please ensure you review your privacy requirements and adjust network and access settings as necessary for your specific case.\n  \n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/0273e197-6e5b-4a1c-93cc-7597730c384b\"\u003e\n\n   - Click **Review + create** and then **Create**. Once it's done, you'll be able to see it in your Resource Group.\n\n        \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/8b61b79c-9f3f-47f0-b59f-6720edebe41e\"\u003e\n\n2. 
**Create a Blob Container**:\n   - Go to your Storage Account.\n   - Under **Data storage**, select **Containers**.\n   - Click **+ Container**.\n   - Enter a name for the container (e.g., `pdfinvoices`) and set the public access level to **Private**.\n   - Click **Create**.\n\n        \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/9b4900e3-7ce8-42aa-b5d9-c2fbb2417721\"\u003e\n\n## Step 3: Set Up Azure Cosmos DB\n\n\u003e `Azure Cosmos DB` is a globally distributed, `multi-model database service provided by Microsoft Azure`. It is designed to offer high availability, scalability, and low-latency access to data for modern applications. Unlike traditional relational databases, Cosmos DB is a `NoSQL database, meaning it can handle unstructured, semi-structured, and structured data types`. `It supports multiple data models, including document, key-value, graph, and column-family, making it versatile for various use cases.` \u003cbr/\u003e \u003cbr/\u003e\n\u003e An `Azure Cosmos DB container` is a `logical unit` within a Cosmos DB database where data is stored. `Containers are schema-agnostic, meaning they can store items with different structures. Each container is automatically partitioned to scale out across multiple servers, providing virtually unlimited throughput and storage`. Containers are the primary scalability unit in Cosmos DB, and they use a partition key to distribute data efficiently across partitions.\n\n1. **Create a Cosmos DB Account**:\n   - In the Azure portal, navigate to your **Resource Group**.\n   - Click **+ Create**.\n   - Search for `Cosmos DB`, click on `Create`:\n     \n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/ecdb9a17-5623-4dc0-a607-92448950b7a0\"\u003e\n\n   - Choose your desired API type; for this example, we will be using `Azure Cosmos DB for NoSQL`. 
This option supports a SQL-like query language, which is familiar and powerful for querying and analyzing your invoice data. It also integrates well with various client libraries, making development easier and more flexible.\n\n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/db942359-8a81-4289-9ea7-91234b4c3802\"\u003e\n\n   - Please enter an account name (e.g., `contosocosmosdbaidemo`). As with the previously configured resources, we will use the `Public network` for this example. Ensure that you adjust the architecture to include your networking requirements.\n   - Select the region and other settings.\n   - Click **Review + create** and then **Create**.\n\n       \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/42b415d3-0d38-4b69-9e18-7bc4015b4a6d\"\u003e\n\n2. **Create a Database and Container**:\n   - Go to your Cosmos DB account.\n   - Under **Data Explorer**, click **New Database**.\n\n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/5f816576-8160-444c-8abc-086b450d98b1\"\u003e\n\n   - Enter a database name (e.g., `ContosoDBAIDemo`) and click **OK**.\n\n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/5dcb8d28-b042-4038-ac37-2663b8013a3a\"\u003e\n\n   - Click **New Container**.\n   - Enter a container name (e.g., `Invoices`) and set the partition key (e.g., `/invoice_number`).\n   - Click **OK**.\n\n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/0232de53-ee75-4f20-a45d-49cf54e3f794\"\u003e\n\n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/50fc8358-c33d-436d-9661-4127465fc21b\"\u003e\n\n## Step 4: Set Up Azure Functions for Document Ingestion and Processing\n\n\u003e An `Azure Function App` is a `container for hosting individual Azure Functions`. 
It provides the execution context for your functions, allowing you to manage, deploy, and scale them together. `Each function app can host multiple functions, which are small pieces of code that run in response to various triggers or events, such as HTTP requests, timers, or messages from other Azure services`. \u003cbr/\u003e \u003cbr/\u003e\n\u003e Azure Functions are designed to be lightweight and event-driven, enabling you to build scalable and serverless applications. `You only pay for the resources your functions consume while they are running, making it a cost-effective solution for many scenarios`.\n\n### Create a Function App\n\n- In the Azure portal, go to your **Resource Group**.\n- Click **+ Create**.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/efac220b-72db-447b-98a7-58196b3d39dd\"\u003e\n\n- Search for `Function App`, click on `Create`:\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/571c2880-cff7-4ed5-9840-2f1b5f58ce46\"\u003e\n\n- Choose a `hosting option`; for this example, we will use `Consumption`. Click [here for a quick overview of hosting options](https://github.com/brown9804/MicrosoftCloudEssentialsHub/tree/main/0_Azure/3_AzureAI/14_AIUseCases/0_PDFProcessingFAOF#function-app-hosting-options):\n       \n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a8bd30c5-7b21-4aac-adf2-6e1dc5ec509a\"\u003e\n\n- Enter a name for the Function App (e.g., `ContosoFunctionAppAI`).\n- Choose your runtime stack (e.g., `.NET` or `Python`).\n- Select the region and other settings.\n\n  \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/e32fb474-c954-475b-971e-599f9909d35a\"\u003e\n\n- Select **Review + create** and then **Create**. 
Verify the resources created in your `Resource Group`.\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/352c95c7-bf6a-4e1d-937b-98dc018b69a4\"\u003e\n\n \u003e [!IMPORTANT]\n \u003e This example uses a system-assigned managed identity to assign RBAC (Role-Based Access Control) roles.\n \u003e \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/dd5ce062-9720-40d3-b860-778048c38e6c\"\u003e\n\n- Please assign the `Storage Blob Data Contributor` and `Storage File Data SMB Share Contributor` roles to the `Function App` within the `Storage Account` related to the runtime (the one created with the function app).\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a08f77bf-71d4-4922-8001-cf402e9e81f2\"\u003e\n\n- Assign `Storage Blob Data Reader` to the `Function App` within the `Storage Account` that will contain the invoices, then click `Next`. Then, click on `select members` and search for your `Function App` identity. Finally, click on `Review + assign`:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/dcfdd7f0-f7a6-4829-876a-87383887e0e2\"\u003e\n\n- Also add `Cosmos DB Operator`, `DocumentDB Account Contributor`, `Cosmos DB Account Reader Role`, `Contributor`:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/3ef8a8ae-4a98-4d1f-9ff0-61ac2829ebf9\"\u003e\n\n- To assign the `Microsoft.DocumentDB/databaseAccounts/readMetadata` permission, you need to create a custom role in Azure Cosmos DB. This permission is required for accessing metadata in Cosmos DB. 
Click [here to understand more about it](https://learn.microsoft.com/en-us/azure/cosmos-db/nosql/security/how-to-grant-data-plane-role-based-access?tabs=built-in-definition%2Ccsharp\u0026pivots=azure-interface-cli#prepare-role-definition).\n\n    | **Aspect**         | **Data Plane Access**                                                                 | **Control Plane Access**                                                                 |\n    |--------------------|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------|\n    | **Scope**          | Focuses on `data operations` within databases and containers. This includes actions such as reading, writing, and querying data in your databases and containers. | Focuses on `management operations` at the account level. This includes actions such as creating, deleting, and configuring databases and containers. |\n    | **Roles**          | - `Cosmos DB Built-in Data Reader`: Provides read-only access to data within the databases and containers. \u003cbr\u003e - `Cosmos DB Built-in Data Contributor`: Allows read and write access to data within the databases and containers. \u003cbr\u003e - `Cosmos DB Built-in Data Owner`: Grants full access to manage data within the databases and containers. | - `Contributor`: Grants full access to manage all Azure resources, including Cosmos DB. \u003cbr\u003e - `Owner`: Grants full access to manage all resources, including the ability to assign roles in Azure RBAC. \u003cbr\u003e - `Cosmos DB Account Contributor`: Allows management of Cosmos DB accounts, including creating and deleting databases and containers. \u003cbr\u003e - `Cosmos DB Account Reader`: Provides read-only access to Cosmos DB account metadata. |\n    | **Permissions**    | - `Reading documents` \u003cbr\u003e - `Writing documents` \u003cbr\u003e - Managing data within containers. 
| - `Creating or deleting databases and containers` \u003cbr\u003e - Configuring settings \u003cbr\u003e - Managing account-level configurations. |\n    | **Authentication** | Uses `Azure Active Directory (AAD) tokens` or `resource tokens` for authentication.                      | Uses `Azure Active Directory (AAD)` for authentication.                                 |\n\n\u003e Steps to assign it:\n\n1. **Open Azure CLI**: Go to the [Azure portal](https://portal.azure.com) and click on the icon for the Azure CLI.\n\n     \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/b2643d9a-7364-454a-bb24-f16270e99d92\"\u003e\n\n2. **List Role Definitions**: Run the following command to list all of the role definitions associated with your Azure Cosmos DB for NoSQL account. Review the output and locate the role definition named `Cosmos DB Built-in Data Contributor`.\n\n     ```shell\n     az cosmosdb sql role definition list \\\n         --resource-group \"\u003cyour-resource-group\u003e\" \\\n         --account-name \"\u003cyour-account-name\u003e\"\n     ```\n    \n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/4c19d70e-d525-4c15-bb0e-518f50f61b37\"\u003e\n    \n3. **Get Cosmos DB Account ID**: Run this command to get the ID of your Cosmos DB account. 
Record the value of the `id` property as it is required for the next step.\n\n     ```shell\n     az cosmosdb show --resource-group \"\u003cyour-resource-group\u003e\" --name \"\u003cyour-account-name\u003e\" --query \"{id:id}\"\n     ```\n\n     Example output:\n    \n     ```json\n     {\n       \"id\": \"/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}\"\n     }\n     ```\n    \n     \u003cimg width=\"750\" alt=\"image\" src=\"https://github.com/user-attachments/assets/f3426130-2de5-46a0-96f0-4c6e15e57975\"\u003e\n\n4. **Assign the Role**: Assign the new role using `az cosmosdb sql role assignment create`. Use the previously recorded role definition ID for the `--role-definition-id` argument, the principal ID of the identity for the `--principal-id` argument, and your account's resource ID for the `--scope` argument. 
You need to create two role assignments: one for the Function App's identity (so it can read metadata from Cosmos DB) and one for your own user ID (so you can access and view the data).\n  \n     \u003e You can find the Function App's `principal-id` under its `Identity` section:\n    \n      \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/89c524ce-a392-4507-90ad-19becddff923\"\u003e\n    \n     ```shell\n     az cosmosdb sql role assignment create \\\n         --resource-group \"\u003cyour-resource-group\u003e\" \\\n         --account-name \"\u003cyour-account-name\u003e\" \\\n         --role-definition-id \"\u003crole-definition-id\u003e\" \\\n         --principal-id \"\u003cprincipal-id\u003e\" \\\n         --scope \"/subscriptions/{subscription-id}/resourceGroups/{resource-group-name}/providers/Microsoft.DocumentDB/databaseAccounts/{cosmos-account-name}\"\n     ```\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/72e1e4f9-9228-4ec0-aade-20ad4aaa1f4f\"\u003e\n    \n    \u003e After a few minutes, you will see something like this:\n    \n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/5caea3b9-3e4d-4791-8376-fda4547547bd\"\u003e\n\n5. **Verify Role Assignment**: Use `az cosmosdb sql role assignment list` to list all role assignments for your Azure Cosmos DB for NoSQL account. Review the output to ensure your role assignment was created.\n    \n     ```shell\n     az cosmosdb sql role assignment list \\\n         --resource-group \"\u003cyour-resource-group\u003e\" \\\n         --account-name \"\u003cyour-account-name\u003e\"\n     ```\n\n    \u003cimg width=\"750\" alt=\"image\" src=\"https://github.com/user-attachments/assets/8c7675cf-8183-433c-a21b-f9b7029642d9\"\u003e\n    \n### Configure/Validate the Environment variables\n\n- Under `Settings`, go to `Environment variables`. 
Then `+ Add` the following variables:\n\n- `COSMOS_DB_ENDPOINT`: Your Cosmos DB account endpoint.\n- `COSMOS_DB_KEY`: Your Cosmos DB account key.\n- `contosostorageaidemo_STORAGE`: Your Storage Account connection string.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/ab7cdaad-8939-4a82-99e3-5e7cfd24e908\"\u003e\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/effadf44-7304-4185-a55b-1eb76a5ab8b1\"\u003e\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/905aa59c-9083-4cad-8eb8-b73e5712d2df\"\u003e\n\n- Click on `Apply` to save your configuration.\n\n### Develop the Function\n\n- You need to install [VSCode](https://code.visualstudio.com/download)\n- Install Python from the Microsoft Store:\n   \n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/30f00c27-da0d-400f-9b98-817fd3e03b1c\"\u003e\n\n- Open VSCode and install the `Python` and `Azure Tools` extensions.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/715449d3-1a36-4764-9b07-99421fb1c834\"\u003e\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/854aa665-dc2f-4cbf-bae2-2dc0a8ef6e46\"\u003e\n\n- Click on the `Azure` icon and `sign in` to your account. Allow the `Azure Resources` extension to sign in using Microsoft; it will open a browser window. 
After doing so, you will be able to see your subscription and resources.\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/4824ca1c-4959-4242-95af-ad7273c5530d\"\u003e\n\n- Under Workspace, click on `Create Function Project`, and choose a path on your local computer to develop your function.\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/8db80e51-725e-45fe-91b5-7158fd94be8e\"\u003e\n\n- Choose the language; in this case, it is `python`:\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/2fb19a1e-bb2d-47e5-a56e-8dc8a708647a\"\u003e\n\n- Select the model version; for this example, let's use `v2`:\n \n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/fd46ee93-d788-463d-8b28-dbf2487e9a7f\"\u003e\n\n- For the Python interpreter, let's use the one installed via the `Microsoft Store`:\n\n    \u003cimg width=\"741\" alt=\"image\" src=\"https://github.com/user-attachments/assets/3605c959-fc59-461f-9e8d-01a6a92004a8\"\u003e\n\n- Choose a template (e.g., **Blob trigger**) and configure it to trigger on new PDF uploads in your Blob container.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/0a4ed541-a693-485c-b6ca-7d5fb55a61d2\"\u003e\n\n- Provide a function name, like `BlobTriggerContosoPDFInvoicesRaw`:\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/9cbc7605-8cb3-41f8-b3f7-3dfc9b8e67b5\"\u003e\n\n- Next, it will prompt you for the path of the blob container where you expect the function to be triggered after a file is uploaded. 
In this case, it is `pdfinvoices`, as created previously.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/7005dc44-ffe2-442b-8373-554b229b3042\"\u003e\n\n- Click on `Create new local app settings`, and then choose your subscription.\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/07c211d6-eda0-442b-b428-cdaed2bf12ac\"\u003e\n\n- Choose `Azure Storage Account for remote storage`, and select one. I'll be using `contosostorageaidemo`. \n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/1ca2a494-2716-4b5a-8e7d-caca1eaf88ab\"\u003e\n\n- Then click on `Open in the current window`. You will see something like this:\n\n    \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/59fa0a79-2a23-4864-968f-9f933826dbca\"\u003e\n\n- Now we need to update the function code to extract data from PDFs and store it in Cosmos DB; use this as an example:\n\n \u003e 1. **Blob Trigger**: The function is triggered when a new PDF file is uploaded to the `pdfinvoices` container. \u003cbr/\u003e\n \u003e 2. **PDF Processing**: The `read_pdf_content` function uses `pdfminer.six` to read and extract text from the PDF. \u003cbr/\u003e\n \u003e 3. **Data Extraction**: The extracted text is processed to extract invoice data. The `generate_id` function generates a unique ID for each invoice. \u003cbr/\u003e\n \u003e 4. **Data Storage**: The processed invoice data is saved to Azure Cosmos DB in the `ContosoDBAIDemo` database and `Invoices` container. \n\n \u003e `pdfminer.six` is an open-source framework. It is a community-maintained fork of the original PDFMiner, `designed for extracting and analyzing text data from PDF documents`. 
The framework is built in a modular way, allowing each component to be easily replaced or extended for various purposes.\n\n- Update the `function_app.py`:\n\n  | Template Blob Trigger | Function Code updated |\n  | --- | --- |\n  | \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a4ac6f2d-1419-4629-8896-de202c76000e\"\u003e | \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a9e41cd7-9c3f-4da5-8526-5c7f06107a84\"\u003e | \n\n  ```python\n  import azure.functions as func\n  import logging\n  import json\n  import os\n  import uuid\n  import io\n  from pdfminer.high_level import extract_text\n  from azure.cosmos import CosmosClient, PartitionKey\n  \n  app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)\n  \n  def read_pdf_content(myblob):\n      # Read the blob content into a BytesIO stream\n      blob_bytes = myblob.read()\n      pdf_stream = io.BytesIO(blob_bytes)\n      \n      # Extract text from the PDF stream\n      text = extract_text(pdf_stream)\n      return text\n  \n  def extract_invoice_data(text):\n      lines = text.split('\\n')\n      invoice_data = {\n          \"id\": generate_id(),\n          \"customer_name\": \"\",\n          \"customer_email\": \"\",\n          \"customer_address\": \"\",\n          \"company_name\": \"\",\n          \"company_phone\": \"\",\n          \"company_address\": \"\",\n          \"rentals\": []\n      }\n  \n      for i, line in enumerate(lines):\n          if \"BILL TO:\" in line:\n              invoice_data[\"customer_name\"] = lines[i + 1].strip()\n              invoice_data[\"customer_email\"] = lines[i + 2].strip()\n              invoice_data[\"customer_address\"] = lines[i + 3].strip()\n          elif \"Company Information:\" in line:\n              invoice_data[\"company_name\"] = lines[i + 1].strip()\n              invoice_data[\"company_phone\"] = lines[i + 2].strip()\n              invoice_data[\"company_address\"] = 
lines[i + 3].strip()\n          elif \"Rental Date\" in line:\n              for j in range(i + 1, len(lines)):\n                  if lines[j].strip() == \"\":\n                      break\n                  rental_details = lines[j].split()\n                  rental_date = rental_details[0]\n                  title = \" \".join(rental_details[1:-3])\n                  description = rental_details[-3]\n                  quantity = rental_details[-2]\n                  total_price = rental_details[-1]\n                  invoice_data[\"rentals\"].append({\n                      \"rental_date\": rental_date,\n                      \"title\": title,\n                      \"description\": description,\n                      \"quantity\": quantity,\n                      \"total_price\": total_price\n                  })\n  \n      logging.info(\"Successfully extracted invoice data.\")\n      return invoice_data\n  \n  def save_invoice_data_to_cosmos(invoice_data, blob_name):\n      try:\n          endpoint = os.getenv(\"COSMOS_DB_ENDPOINT\")\n          key = os.getenv(\"COSMOS_DB_KEY\")\n          client = CosmosClient(endpoint, key)\n          logging.info(\"Successfully connected to Cosmos DB.\")\n      except Exception as e:\n          logging.error(f\"Error connecting to Cosmos DB: {e}\")\n          return\n      \n      database_name = 'ContosoDBAIDemo'\n      container_name = 'Invoices'\n      \n      try:\n          database = client.create_database_if_not_exists(id=database_name)\n          container = database.create_container_if_not_exists(\n              id=container_name,\n              partition_key=PartitionKey(path=\"/invoice_number\"),\n              offer_throughput=400\n          )\n          logging.info(\"Successfully ensured database and container exist.\")\n      except Exception as e:\n          logging.error(f\"Error creating database or container: {e}\")\n          return\n      \n      try:\n          response = 
container.upsert_item(invoice_data)\n          logging.info(f\"Saved processed invoice data to Cosmos DB: {response}\")\n      except Exception as e:\n          logging.error(f\"Error inserting item into Cosmos DB: {e}\")\n  \n  def generate_id():\n      return str(uuid.uuid4())\n  \n  @app.blob_trigger(arg_name=\"myblob\", path=\"pdfinvoices/{name}\",\n                    connection=\"contosostorageaidemo_STORAGE\")\n  def BlobTriggerContosoPDFInvoicesRaw(myblob: func.InputStream):\n      logging.info(f\"Python blob trigger function processed blob\\n\"\n                   f\"Name: {myblob.name}\\n\"\n                   f\"Blob Size: {myblob.length} bytes\")\n  \n      try:\n          text = read_pdf_content(myblob)\n          logging.info(\"Successfully read and extracted text from PDF.\")\n      except Exception as e:\n          logging.error(f\"Error reading PDF: {e}\")\n          return\n  \n      logging.info(f\"Extracted text from PDF: {text}\")\n  \n      try:\n          invoice_data = extract_invoice_data(text)\n          logging.info(f\"Extracted invoice data: {invoice_data}\")\n      except Exception as e:\n          logging.error(f\"Error extracting invoice data: {e}\")\n          return\n  \n      try:\n          save_invoice_data_to_cosmos(invoice_data, myblob.name)\n          logging.info(\"Successfully saved invoice data to Cosmos DB.\")\n      except Exception as e:\n          logging.error(f\"Error saving invoice data to Cosmos DB: {e}\")\n  ```\n\n- Now, let's update the `requirements.txt`:\n\n| Template `requirements.txt` | Updated `requirements.txt` |\n| --- | --- |\n| \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/d7dec16e-4f78-446d-a7e0-4d3b1d43bec4\"\u003e | \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/b30e6450-515b-4070-b91b-8040ecbde738\"\u003e |\n\n```text\nazure-functions\npdfminer.six\nazure-cosmos==4.3.0\n```\n\n- Since this function has already been 
tested, you can deploy your code to the Function App in your subscription. If you want, you can also run your function locally for testing first.\n  - Click on the `Azure` icon.\n  - Under `workspace`, click on the `Function App` icon.\n  - Click on `Deploy to Azure`.\n\n     \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a9f90f93-a8ce-467f-a675-f9b8737f5a3e\"\u003e\n\n  - Select your `subscription`, your `function app`, and accept the prompt to overwrite:\n\n     \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/a6492b14-491c-44d6-8d5a-f9ea670f174b\"\u003e\n\n  - After it completes, you will see the status in your terminal:\n\n     \u003cimg width=\"959\" alt=\"image\" src=\"https://github.com/user-attachments/assets/f742fb8e-4974-4222-b67e-c3dc6939740f\"\u003e\n\n     \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/94052759-4c0f-483d-8a78-5803fab5c961\"\u003e\n\n\u003e [!IMPORTANT]\n\u003e If you need further assistance with the code, please click [here to view all the function code](./src/).\n\n## Step 5: Test the solution\n\n\u003e Upload sample PDF invoices to the Blob container and verify that data is correctly ingested and stored in Cosmos DB.\n\n- Click on `Upload`, then select `Browse for files` and choose your PDF invoices to be stored in the blob container, which will trigger the function app to parse them.\n\n   \u003cimg width=\"950\" alt=\"image\" src=\"https://github.com/user-attachments/assets/b45d203c-0392-4488-bf21-f9b6129c9709\"\u003e\n\n- Check the logs and traces from your function with `Application Insights`:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/b4ce7b7c-abab-4209-8af3-6c30dc5f667b\"\u003e\n\n- Under `Investigate`, click on `Performance`. Filter by time range, and `drill into the samples`. 
Sort the results by date (if you have many, like in my case) and click on the last one.\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/01bdebef-85af-4906-b9c0-3761e3a67c1f\"\u003e\n\n- Click on `View all`:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/355ed2ec-ff2f-4571-99c2-6c4afc6c9aff\"\u003e\n\n- Check all the logs and traces generated. Also, review the parsed information:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/4cc6ec56-5419-4668-aad1-7e59d8182ea5\"\u003e\n\n- Validate that the information was uploaded to Cosmos DB. Under `Data Explorer`, check your `Database`:\n\n   \u003cimg width=\"550\" alt=\"image\" src=\"https://github.com/user-attachments/assets/28bba46b-eaf0-4bbc-a565-f7c1ad8a0ac6\"\u003e\n\n\u003c!-- START BADGE --\u003e\n\u003cdiv align=\"center\"\u003e\n  \u003cimg src=\"https://img.shields.io/badge/Total%20views-1286-limegreen\" alt=\"Total views\"\u003e\n  \u003cp\u003eRefresh Date: 2025-09-17\u003c/p\u003e\n\u003c/div\u003e\n\u003c!-- END BADGE --\u003e
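As an optional local sanity check before uploading real invoices, you can exercise the same line-based parsing pattern that `extract_invoice_data` uses in Step 4, with plain Python and no Azure resources. This is a minimal sketch: the `extract_bill_to` helper and the sample invoice text are hypothetical, for illustration only.

```python
# Minimal local sanity check for the "BILL TO:" line-based parsing pattern
# used by extract_invoice_data in Step 4. The sample text below is a
# hypothetical invoice layout, for illustration only.

def extract_bill_to(text: str) -> dict:
    """Return the three customer fields that follow a 'BILL TO:' line."""
    lines = text.split("\n")
    data = {"customer_name": "", "customer_email": "", "customer_address": ""}
    for i, line in enumerate(lines):
        if "BILL TO:" in line:
            # The three lines after the marker hold name, email, and address.
            data["customer_name"] = lines[i + 1].strip()
            data["customer_email"] = lines[i + 2].strip()
            data["customer_address"] = lines[i + 3].strip()
    return data

sample = """INVOICE #1001
BILL TO:
Jane Doe
jane@example.com
123 Main Street
"""

print(extract_bill_to(sample))
```

If the printed fields match the sample layout, the same pattern should behave the same way on the text that `pdfminer.six` extracts inside the Blob-triggered function, provided your PDFs follow the expected layout.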