How to Use AirgapAI in a Sensitive Compartmented Information Facility or No-Network Environment: Your Guide to Secure, Offline Artificial Intelligence
Become the mission partner who delivers artificial intelligence capability where the network cannot go.
In the critical domains of defense, intelligence, and the broader public sector, maintaining mission readiness is paramount. This detailed guide is designed to walk you through the entire workflow of AirgapAI, from understanding its foundational concepts to advanced usage, all within the stringent confines of Sensitive Compartmented Information Facilities (SCIFs) or other no-network environments. AirgapAI, developed by Iternal Technologies, is specifically engineered for these demanding scenarios, offering pre-provisioned models and datasets, sealed updates, and robust offline audit capabilities to ensure your operations remain secure and effective.
1. Introduction: Unlocking Artificial Intelligence Where Connectivity Cannot Reach
The digital age has ushered in an era where Artificial Intelligence (AI) promises to revolutionize how we process information, make decisions, and execute tasks. However, for organizations operating with highly sensitive data in secure, air-gapped environments, the promise of cloud-based AI solutions often comes with unacceptable security and data sovereignty risks. AirgapAI by Iternal Technologies steps into this void, providing a robust, entirely local, and highly accurate AI solution designed from the ground up for environments where network access is restricted or non-existent.
1.1 What Exactly is Artificial Intelligence (AI)?
At its core, Artificial Intelligence (AI) refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. It encompasses various fields, including machine learning, natural language processing, and robotics. For our purposes, we're focusing on AI that can understand and generate human language.
1.2 Understanding Large Language Models (LLMs)
A Large Language Model (LLM) is a type of Artificial Intelligence algorithm that uses deep learning techniques and massively large datasets to understand, summarize, generate, and predict new content. Think of it as a highly sophisticated digital assistant that has "read" an enormous amount of text and can now converse, write, and answer questions in a human-like way.
1.3 The Challenge with Cloud-Based Artificial Intelligence in Secure Environments
Most mainstream AI tools, such as Microsoft Copilot or ChatGPT, operate in the cloud, meaning your data is sent over the internet to remote servers for processing. While convenient, this model presents severe challenges for secure operations:
- Data Sovereignty and Control: Your sensitive information leaves your control and resides on third-party servers, potentially in other jurisdictions.
- Security Risks: Data transfer across networks, even encrypted ones, introduces potential interception points and vulnerabilities.
- Artificial Intelligence Hallucinations: Cloud models can sometimes generate incorrect or fabricated information, which is a critical risk when trust is paramount.
- Cost and Licensing: Cloud solutions typically involve recurring subscription fees, often with hidden token charges, leading to unpredictable and high costs.
- Network Dependency: They simply do not function without an active internet connection, rendering them useless in air-gapped or offline environments.
1.4 Introducing AirgapAI: Your Solution for Secure, Local, Offline Artificial Intelligence
AirgapAI, developed by Iternal Technologies, is an innovative answer to these challenges. It is a 100% local, on-device Large Language Model platform that runs entirely on your Artificial Intelligence Personal Computer (AI PC) without any external network connections. This means your data never leaves your device, ensuring complete data sovereignty and security.
1.5 The Imperative of AirgapAI in Sensitive Compartmented Information Facilities and No-Network Environments
A Sensitive Compartmented Information Facility (SCIF) is a secure room or area that prevents electronic surveillance and provides a safe place to process sensitive compartmented information. In such environments, or any scenario where network connectivity is impossible or forbidden (e.g., in remote field operations, on a submarine, or during communications blackouts), AirgapAI is not just an advantage—it's an absolute necessity. It allows intelligence analysts, defense personnel, and public sector teams to leverage the power of advanced AI for critical tasks, knowing their data is fully protected and always available.
2. AirgapAI's Core Value Propositions for Secure Operations
AirgapAI is meticulously engineered to address the unique demands of highly secure and offline operational environments. Its value propositions are built upon the pillars of trust, security, and efficiency.
2.1 Trusted: Precision and Reliability with Blockify Technology
In mission-critical scenarios, the accuracy of information is non-negotiable. AirgapAI leverages Iternal Technologies' patented Blockify data ingestion technology, which refines data inputs for highly accurate responses. This proprietary process leads to an astonishing 78 times (or 7,800%) improvement in Large Language Model accuracy, virtually eliminating Artificial Intelligence hallucinations. You can trust the insights generated by AirgapAI because its foundation is built on verified, curated data.
2.2 Secure: Absolute Data Sovereignty
AirgapAI runs 100% locally on your Artificial Intelligence Personal Computer with no network "in" or "out." This fundamentally eliminates the risk of external network breaches or unintended data leakage to the cloud. It is ideal for zero-trust environments and compliance with the strictest data sovereignty requirements, ensuring that sensitive data always remains within your organizational control and within the physical boundaries of your secure facility.
2.3 Cost-Effective: Own Your Artificial Intelligence
Unlike subscription-based cloud AI solutions that incur ongoing per-user fees, often with hidden token charges, AirgapAI offers a one-time, perpetual license per device. This translates to a cost that is approximately 1/10th to 1/15th of alternatives like Microsoft Copilot or ChatGPT Enterprise, providing significant savings and predictable budgeting without hidden overage bills.
2.4 Offline Access: Continuous Operation Anywhere
Designed for air-gapped use cases, AirgapAI operates fully offline. This ensures that critical AI capabilities are accessible whether you are on a mountain, at the bottom of the ocean, or deep within a Sensitive Compartmented Information Facility. Your team can stay productive and informed regardless of network availability.
2.5 Robust Capability: Comprehensive Artificial Intelligence Assistance
AirgapAI is not just a basic chat interface. It provides a cutting-edge Generative Artificial Intelligence Large Language Model chat assistant with robust capabilities, including:
- Complex Document Processing: Quickly analyze and distill insights from large volumes of text.
- Retrieval-Augmented Question and Answer (QA): Get precise, cited answers from your trusted datasets.
- Entourage Mode: Simulate multi-persona consultations for diverse expert perspectives on complex issues.
- Role-Based Workflows: Pre-configured prompts and datasets tailored to specific user roles, ensuring relevant and secure interactions.
3. Understanding AirgapAI's Core Technologies: The Engine of Mission-Ready Artificial Intelligence
To fully appreciate and utilize AirgapAI, it is important to understand the technologies that power its secure, accurate, and offline capabilities.
3.1 Blockify Technology: The Foundation of Trusted Data
Blockify is Iternal Technologies' ultimate data management solution for Large Language Models at scale. It transforms messy, unstructured data into a precise, query-optimized knowledge base, ensuring superior accuracy for your AI.
3.1.1 What Blockify Does and Why it Matters
The biggest challenge for enterprise AI is often the quality and trustworthiness of the data it uses. Traditional methods of feeding documents to an AI can lead to inaccuracies because enterprise data is often redundant, outdated, or poorly structured. Blockify addresses this by:
- Ingesting Large Datasets: It can process thousands of documents (e.g., sales documents, Request for Proposal (RFP) responses, technical manuals, intelligence reports).
- Condensing into Modular "Blocks": It distills these documents into concise, optimized data units called "blocks." Each block is structured with:
  - A Name: Displayed in blue in the interface to quickly identify the content topic.
  - A Critical Question: The most important query a user might ask related to the block's content.
  - A Trusted Answer: A distilled, accurate, and approved response, avoiding the pitfalls of outdated or redundant information.
- Enhancing Security and Data Lifecycle Management: Each block is tagged with rich metadata, including classification levels and access permissions, which is crucial for zero-trust environments. This ensures that the AI only accesses information relevant and permissible to the user.
- Delivering Unparalleled Accuracy: This meticulous process can reduce the original data size by as much as 97.5% (down to 2.5% of the original) and, most remarkably, improve the accuracy of your Large Language Models by 7,800%. This drastic reduction in Artificial Intelligence hallucinations is critical for building confidence and trust in AI outputs, especially in high-stakes environments.
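As an illustration of this structure, a single Blockify "block" might be serialized as one record in a JSON Lines (.jsonl) dataset, along the lines of the sketch below. The field names and values here are assumptions for illustration only; the actual Blockify schema may differ.

```json
{
  "name": "AirgapAI Deployment Model",
  "critical_question": "How is AirgapAI deployed in an air-gapped environment?",
  "trusted_answer": "AirgapAI is installed locally from a pre-provisioned package and runs entirely on the device, with no network connectivity required.",
  "metadata": {
    "classification": "UNCLASSIFIED",
    "permissions": ["analyst", "administrator"]
  }
}
```

Because each block carries its own metadata, access filtering can happen at retrieval time rather than at the document level.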
3.1.2 Data Ingestion and File Compatibility
Blockify supports a wide range of common file formats:
- Document Formats: Text (.txt), HyperText Markup Language (.html), Portable Document Format (.pdf), Microsoft Word (.docx), Microsoft PowerPoint (.pptx).
- Graphic and Media Files: Images can be processed directly, and for video or audio content, still frames can be extracted and embedded audio transcribed, where applicable.
For optimal results, it is recommended that customer data be curated into relevant categories (e.g., specific product lines, business units, intelligence domains) to leverage Blockify's hierarchical metadata and taxonomy framework effectively. As new documents are Blockified, the updated datasets can be securely pushed to local devices via standard image management applications like Microsoft Intune.
3.2 AirgapAI Application Overview: Your Local Artificial Intelligence Hub
AirgapAI is designed to be as intuitive and accessible as mainstream chat applications, yet it operates with unparalleled security and local control.
- Local, Modular Application: It functions as a locally installed, ChatGPT-like application that leverages open-source Large Language Models.
- On-Device Operation: AirgapAI runs entirely on the client device (such as a Dell Precision workstation or any Artificial Intelligence Personal Computer), eliminating external network dependencies.
- Seamless Distribution and Updates:
  - It is distributed as an executable application that integrates seamlessly into standard Windows imaging workflows. This means it can be pre-loaded onto devices just like Microsoft Word or Excel.
  - Application updates and dataset updates can be pushed to the device by IT departments using the same image management applications employed for other software, ensuring controlled and secure deployment.
- Flexible Model Integration:
  - Users can "Bring Your Own Model" (BYOM) by uploading their preferred Large Language Models.
  - Alternatively, choose from a suite of pre-quantized, open-source models included with the application (e.g., Llama, Mistral, DeepSeek). Our engineering team can also package and deploy other needed models as a service.
- Optimal Hardware Utilization: AirgapAI is designed to leverage all available compute resources on an AI PC, including the Central Processing Unit (CPU), Graphics Processing Unit (GPU), and Neural Processing Unit (NPU). This ensures maximum performance regardless of your device's configuration.
  - CPU: Excellent for searching through millions of records with extremely low latency.
  - GPU: Highly performant for running larger Large Language Models, often integrated into modern AI PCs at no additional cost.
  - NPU: Handles sustained, heavily used Large Language Model workloads at low power, enhancing efficiency and battery life, which is critical for field operations.
3.3 Additional Unique Features for Enhanced Productivity and Security
- Role-Based Workflows: AirgapAI includes Quick Start workflows tailored for different user roles. Whether you are in procurement, legal, engineering, or an intelligence analyst, you can have pre-configured prompts that automatically select relevant, curated datasets. This streamlines interaction and ensures data access is appropriate for the user's function.
- Entourage Mode: This unique feature allows users to interact with multiple Artificial Intelligence personas simultaneously.
  - For example, when conducting a complex intelligence assessment, you can configure one persona as a Central Intelligence Agency (CIA) analyst and another as a military tactician. The CIA analyst persona would be set up with expertise in intelligence gathering, target package details, and sensitive data interpretation. The military tactician persona would be tuned to provide insights on ground operations, combat strategies, and tactical decision-making. Users can simultaneously ask the same question and receive distinct answers from each expert, giving a multi-perspective view on complex issues. This multi-persona approach supports high-stakes decision-making and scenario planning by combining diverse expert viewpoints, all within a secure, offline environment.
4. Setting Up AirgapAI for Mission Readiness: Your Pre-Mission Readiness Checklist
Deploying AirgapAI in a Sensitive Compartmented Information Facility or no-network environment requires careful pre-mission preparation. This section details the steps for installation, onboarding, and initial configuration.
4.1 System Requirements & Prerequisites
Ensure your Artificial Intelligence Personal Computer (AI PC) meets the following specifications for optimal AirgapAI performance:
| Component | Minimum | Recommended |
|---|---|---|
| Central Processing Unit (CPU) | 8 Cores | 8 Cores / 16 Threads or better |
| Random Access Memory (RAM) | 16 Gigabytes | 32 Gigabytes or more |
| Disk Storage | 10 Gigabytes free (Solid State Drive) | 50 Gigabytes Non-Volatile Memory Express (NVMe) |
| Graphics Processing Unit (GPU) | 4 Gigabytes Video Random Access Memory (VRAM), 2024 or newer | 8 Gigabytes Video Random Access Memory (VRAM) or more |
| Operating System (OS) | Windows 11 | Windows 11 with the latest patches |
| Permissions | Security permissions to install | |
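As a quick sanity check against the table above, you can confirm the CPU core count before installation. The snippet below is a portable shell sketch for illustration; on the actual Windows 11 target you would typically consult systeminfo, Task Manager, or your endpoint management tooling instead.

```shell
# Illustrative pre-flight check of CPU core count against the 8-core minimum.
cores=$(nproc)
echo "CPU cores detected: $cores"
if [ "$cores" -ge 8 ]; then
  echo "CPU core count meets the 8-core minimum"
else
  echo "CPU core count is below the 8-core minimum"
fi
```

Similar checks for memory and free disk space can be folded into the same pre-provisioning script.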
4.2 Downloading and Installing the Installer Package (No Network)
Since AirgapAI operates in a no-network environment, the installer package and initial models/datasets must be pre-provisioned.
- Obtain Installer Package: Your IT department will provide the latest AirgapAI ZIP archive (e.g., AirgapAI-v1.0.2-Install.zip) via a secure, approved transfer method (e.g., encrypted Universal Serial Bus (USB) drive, secure internal server access).
- Save Securely: Save the ZIP archive to a designated, writeable folder on your device.
- Extract All Files:
  - Right-click on the AirgapAI-v1.0.2-Install.zip file.
  - Select "Extract All..."
  - Choose a destination folder (the default usually creates a new folder under the current directory). Click "Extract."
- Launch Installer:
  - Open the newly extracted folder.
  - Double-click AirgapAI Chat Setup.exe.
- Follow the Installer Wizard:
  - Accept the license agreement.
  - Choose to create a Desktop Shortcut (recommended for easy access).
  - Click "Install."
  - Click "Finish" once complete.
  - If your Operating System's security features (e.g., SmartScreen) prompt you, choose "Allow" or "Run anyway," as this is an approved internal application.
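Because the archive crosses the air gap by hand, many organizations also record a cryptographic digest on the connected side and re-check it after transfer. A minimal sketch using sha256sum follows; a stand-in file is created so the example is self-contained, whereas in practice you would hash the real AirgapAI-v1.0.2-Install.zip and carry the digest file alongside it.

```shell
# Create a stand-in file representing the installer archive.
printf 'example payload' > installer.zip

# Source side (outside the secure facility): record the SHA-256 digest.
sha256sum installer.zip > installer.zip.sha256

# Destination side (inside the secure facility): re-verify after transfer.
sha256sum -c installer.zip.sha256   # prints "installer.zip: OK" on success
```

Any mismatch causes the check to fail, flagging a corrupted or tampered archive before it is ever executed.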
4.3 First-Launch Onboarding Wizard (Post-Installation Setup)
Upon first launch, AirgapAI Chat will check for existing models. If none are found (which will be the case for a fresh installation), the Onboarding flow will begin.
- Start Onboarding: Click the "Start Onboarding" button.
- Profile & Chat Style:
  - Enter a display name (default is "You").
  - Pick a preferred Chat Style (e.g., Iternal Professional, Casual, Dark Mode, Retro). This customizes the user interface.
  - Click "Next."
- Uploading the Core Large Language Model (LLM):
  - On the Models screen, expand the "Available Models" drop-down. It will be empty initially.
  - Click "Upload Model."
  - Browse to the /models/ subfolder within the folder where you extracted the installer package.
  - Choose a model suited to your hardware:
    - Llama-1B: Suitable for 2024 Integrated Graphics Processing Units (iGPU) or low-power devices.
    - Llama-3B: Recommended for iGPUs from 2025 or dedicated Graphics Processing Units (GPU).
  - Click "Save." The upload typically takes about 30 seconds.
  - Note for Administrators: You can also add or update Large Language Models by accessing the folder created after model upload, typically found under %appdata% at C:\Users\[Your_Username]\AppData\Roaming\IternalModelRepo.
- Uploading an Embeddings Model:
  - Still on the onboarding page, click "Upload Embeddings Model."
  - Open the /models/ folder again and select Jina-Embeddings.zip.
  - Click "Save." This upload also takes approximately 30 seconds.
  - Note for Administrators: As with Large Language Models, Embeddings Models can be managed by modifying the contents of the C:\Users\[Your_Username]\AppData\Roaming\IternalModelRepo folder.
- Adding Sample or Custom Datasets:
  - Retrieval-Augmented Generation (RAG) is powered by datasets. These are crucial for secure, accurate answers from your proprietary data.
  - Click "Upload Dataset."
  - Navigate to the /datasets/ folder from the install package.
  - Select a sample dataset, for example, CIA_World_Factbook_US.jsonl.
  - Click "Save."
  - Note for Administrators: Datasets loaded on the system can be updated by modifying the files saved under %appdata% at C:\Users\[Your_Username]\AppData\Roaming\airgap-ai-chat\CorpusRepo.
  - Tip: While you can upload Microsoft Word, Portable Document Format (PDF), or Text (.txt) files directly, converting larger corpora to Blockify format dramatically increases accuracy (up to 78 times). Local, on-device Blockify will be available in Quarter 3 (Q3) of 2025.
- Finish Onboarding: Verify that the core Large Language Model, Embeddings Model, and at least one dataset are added, then click "Continue." AirgapAI Chat will now boot with your selections, ready for use.
4.4 Optional Setup Steps for IT Administrators
These steps are not required for basic user functionality but are valuable for IT teams managing AirgapAI within a larger deployment.
4.4.1 Dell Technologies Dell Pro Artificial Intelligence Studio (DPAIS) Support
AirgapAI Chat supports native integration with Dell Technologies’ Dell Pro Artificial Intelligence Studio (DPAIS).
- As the IT Systems administrator, install the required files to enable a Large Language Model via DPAIS (both Intel and Qualcomm are supported).
- After DPAIS services are running and you have validated that local Large Language Model Application Programming Interface (API) endpoints can be called, open PowerShell (as administrator) and run the following command:

  [System.Environment]::SetEnvironmentVariable("DPAIS_ENDPOINT", "http://localhost:8553/v1/openai", "User")
- Relaunch the AirgapAI Chat application, and the DPAIS Large Language Models available will automatically appear in the model selection menu in the settings page.
4.4.2 Update File Server Configuration for Sealed Updates
For managing updates in air-gapped environments, IT can configure AirgapAI to look for updates from a local server.
- You can change the file server update location by modifying the updaterConfig.json file located at C:\Users\[Your_Username]\AppData\Local\Programs\AirgapAI Chat\resources\auto-updater\updaterConfig.json.
- This JavaScript Object Notation (JSON) file contains a structure similar to:

  {
    "win32-x64-prod": {
      "readme": "",
      "update": "https://d30h3ho4go3k4y.cloudfront.net/releases/prod/public/chat-assistant/prod/public/1.0.2/AirgapAI Chat Setup 1.0.2.exe",
      "install": "https://d30h3ho4go3k4y.cloudfront.net/releases/prod/public/chat-assistant/prod/public/1.0.2/AirgapAI Chat Setup 1.0.2.exe",
      "version": "1.0.2"
    }
  }

- IT administrators can modify the update and install Uniform Resource Locators (URLs) to point to a securely provisioned local update server within the air-gapped environment.
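For illustration, a version of that configuration repointed at an internal server might look like the sketch below. The hostname updates.scif.local and the path layout are placeholder assumptions; substitute the actual name and layout of your securely provisioned update server.

```json
{
  "win32-x64-prod": {
    "readme": "",
    "update": "https://updates.scif.local/airgapai/1.0.2/AirgapAI Chat Setup 1.0.2.exe",
    "install": "https://updates.scif.local/airgapai/1.0.2/AirgapAI Chat Setup 1.0.2.exe",
    "version": "1.0.2"
  }
}
```

Keeping the version field in step with the staged installer lets the updater detect when a new sealed release has been published inside the air gap.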
5. Training and Everyday Workflows in a Secure Environment
Once AirgapAI is installed and configured, you can immediately begin leveraging its powerful Artificial Intelligence capabilities.
5.1 Initial Model Benchmarking
On first model launch, AirgapAI Chat will offer to Benchmark your hardware.
- Click "Run Benchmark" (recommended).
- Duration: Approximately 2 minutes. This process measures tokens per second and inference speed, providing critical performance data.
- You can skip the benchmark, but context-size limits will remain at a conservative 2,000 tokens until one is completed.
- You can change the token context window after benchmarking by visiting "Settings > Chat" and dragging the slider to the desired size (up to 32,000 tokens after benchmarking).
5.2 AirgapAI Chat User Interface Tour
The AirgapAI interface is designed to be familiar and intuitive, similar to popular chat applications.
- Chat Window: The central area where you interact with the AI.
- Workflow Bar: Located below the new chat window, this bar provides quick access to pre-configured tasks and templates.
- Sidebar: Allows you to toggle datasets on or off, view chat history, and manage settings.
- Prompt Entry Box: Where you type your questions or commands for the AI.
5.3 Everyday Workflows with Secure Data
Here's how to execute common tasks using AirgapAI, keeping in mind the secure, offline nature of the environment.
5.3.1 File Upload & Summarization
This workflow allows you to quickly get insights from local documents without ever sending them to the cloud.
- Method 1: Drag and Drop: Drag a file (.pdf, .docx, .txt) directly onto the chat window.
- Method 2: Click Attachment Icon: Click the paperclip icon (📎) in the chat interface to browse and select a file.
- Prompt Example: Once uploaded, you can type a prompt like: "Summarize this document in bullet points, focusing on key operational procedures."
- Result: AirgapAI will embed and summarize the document instantly, leveraging its local Large Language Model.
5.3.2 Guided Demo Workflows
AirgapAI includes Quick Start workflows that streamline common business tasks by pre-configuring prompts and dataset selections.
- Access Workflows: Locate the Workflow Bar below the new chat window.
- Select Workflow: Choose a relevant workflow (e.g., "Sales Proposal – Cover Letter," "Intelligence Briefing Outline").
- Upload Supporting Document(s): If the workflow requires external information, upload the necessary local documents.
- Enter Prompt: Provide a minimal or robust prompt (e.g., "Write a cover letter for the X-Ray project," or "Generate an outline for a threat assessment on Region Alpha").
- Receive Output: The system will generate a fully-engineered output tailored to the workflow.
- Copy Output: Click the "Copy" icon (📋) to place the text on your clipboard for use in other applications.
5.3.3 Retrieval-Augmented Question and Answer (QA) with Blockify Datasets
This is where the power of your Blockify-processed, trusted data truly shines, enabling highly accurate and cited responses.
- Toggle Dataset On: In the sidebar, toggle your desired dataset ON (e.g., "CIA World Factbook for USA dataset," or your organization's proprietary intelligence dataset).
- Ask a Question: Type a specific question that the dataset is designed to answer. Example: "What are the major political parties in the United States, and what are their primary platforms?" or "Describe the operational capabilities of Unit 731 as detailed in the classified document set."
- Receive Cited Answer: The Retrieval-Augmented Generation (RAG) engine will fetch relevant "IdeaBlocks" from your local dataset, and the Large Language Model will synthesize a coherent, trusted answer, showing citations to the original blocks of information. This ensures transparency and verifiability, crucial for high-stakes analysis.
5.3.4 Entourage Mode (Multi-Persona Chat)
Entourage Mode allows you to gain diverse perspectives by having multiple AI personas interact with the same query, drawing from their respective specialized datasets.
- Select Entourage Mode: Choose an Entourage Mode Quick Start workflow from the new chat page.
- Configure Personas: Go to "Advanced Settings > Personas" to set up or select predefined personas (e.g., "Marketing Analyst," "Legal Counsel," "Engineering Expert," or in a secure context, "Cybersecurity Analyst," "Military Strategist," "Logistics Planner"). Each persona is typically linked to specific Blockify datasets relevant to its expertise.
- Ask a Question: Pose a complex question.
  - Recommended Prompt Example: "I am launching a new initiative called AirgapAI, it is a 100% local chat Large Language Model solution that is 1/10th the cost of other solutions with more capabilities, what do you think? Please answer in short sentences."
  - Secure Prompt Example: "Based on available intelligence, what are the potential vulnerabilities and counter-measures for a forward operating base against a combined land and air threat? Provide insights from a 'Threat Analyst' and a 'Defensive Operations Commander' persona."
- Review Responses: Responses from each persona will appear in a queue, often with a persona activity indicator showing which persona is "typing." This provides a multi-faceted analysis without exposing your query to external networks.
5.3.5 Multilingual Conversations
If the deployed Large Language Model supports it, AirgapAI can seamlessly switch between languages.
- Prompt Example: "Tell me a short story in German about renewable energy." or "Summarize the captured enemy communications in Farsi, then translate the key points into English."
- Result: The Large Language Model will generate content in the requested language. Use the "Stop" button to halt generation at any time.
5.4 Role-Based Workflows and Multi-User Access
AirgapAI is designed for enterprise deployment, supporting multiple users on a single device while maintaining personalized, secure experiences.
- User Profiles: Because the application is tied to the user's profile on login, you can have multiple users of the same device, each leveraging AirgapAI with their own isolated experiences and datasets.
- IT Configuration: This is configured per user profile through your standard image and provisioning process, ensuring that each user only accesses datasets and workflows appropriate for their role and security clearance, all locally on the device.
6. Maintaining AirgapAI in a Secure Environment: Sealed Updates and Offline Audits
In an air-gapped environment, updates and maintenance require a controlled, secure protocol to ensure the integrity and security of the system.
6.1 Ongoing Updates and Maintenance (Sealed Updates)
AirgapAI is designed for secure, auditable updates that do not compromise the air-gapped status of your systems.
- Synchronized Update Cadence: AirgapAI's update cadence is synchronized with your typical Operating System or enterprise software update cycle.
- IT-Managed Deployment: Whether pushing new data, security patches, or application version updates, IT can deploy new versions through familiar image management solutions (e.g., Microsoft Intune, System Center Configuration Manager (SCCM), or other internal tools).
- Implications for SCIF/No-Network:
  - Secure Update Channels: Updates must be transferred into the air-gapped environment using approved, physically secure methods (e.g., encrypted media, "sneakernet" via certified personnel).
  - Verification and Auditing: Each update package should be rigorously scanned and verified by your security teams before deployment. AirgapAI's design facilitates offline audits and the creation of "evidence packs" to document changes and ensure compliance.
  - No Automatic Cloud Updates: AirgapAI will never attempt to connect to external cloud servers for updates, respecting the air-gapped configuration.
6.2 Model & Dataset Management
As new information becomes available, or as Large Language Models evolve, IT administrators can manage and update the models and datasets used by AirgapAI.
- Model Updates: Admins can add or update Large Language Models by modifying the contents of the C:\Users\[Your_Username]\AppData\Roaming\IternalModelRepo folder.
- Dataset Updates: As new documents are Blockified, updated datasets can be pushed to local devices by modifying the contents of the C:\Users\[Your_Username]\AppData\Roaming\airgap-ai-chat\CorpusRepo folder. This ensures that the AI's knowledge base remains current and accurate for your mission.
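The dataset push described above amounts to a file copy into the corpus folder. The sketch below uses a local stand-in directory so it is self-contained; on a real device the destination would be the CorpusRepo path given above, and the copy would typically be performed by your image management tooling (e.g., Microsoft Intune) rather than by hand.

```shell
# Stand-in for C:\Users\[Your_Username]\AppData\Roaming\airgap-ai-chat\CorpusRepo
CORPUS_REPO="./CorpusRepo"
mkdir -p "$CORPUS_REPO"

# A freshly Blockified dataset produced outside the air gap (stand-in content).
printf '{"name":"Example Block"}\n' > Updated_Dataset.jsonl

# Push the dataset into the corpus folder; AirgapAI reads datasets from here.
cp Updated_Dataset.jsonl "$CORPUS_REPO/"
ls "$CORPUS_REPO"
```

In a managed deployment, the same copy step would be wrapped in a signed, audited package so the transfer itself leaves an evidence trail.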
6.3 Advanced Configuration for IT and Power Users
AirgapAI offers advanced settings for fine-tuning performance and customization.
- Context-Window Expansion: After completing the initial model benchmark, go to "Settings > Model Settings" and adjust the "Max Tokens" slider. You can set the context window up to 32,000 tokens, allowing the Large Language Model to process much larger chunks of information for more comprehensive responses.
- Styling & Themes: In "Settings > Appearance," you can switch between predefined themes or, for advanced users, build custom Cascading Style Sheets (CSS) to tailor the user interface.
- Workflow Templates: In "Settings > Workflows," IT administrators can add or edit prompt chains. This is ideal for pre-loading company-specific tasks and standardizing Artificial Intelligence interactions across the organization, ensuring consistency and adherence to protocols.
- In-App Benchmarking Suite: The "Settings > Benchmarking" tab allows you to test the performance of new or updated Large Language Models, providing valuable data on tokens per second and inference speed.
7. Troubleshooting and Support in Restricted Environments
While AirgapAI is designed for stability and ease of use, issues may arise. Troubleshooting in a Sensitive Compartmented Information Facility or no-network environment requires a specific approach.
- Internal Knowledge Base: Rely heavily on your internal documentation, help desk, and support teams first. AirgapAI's design for enterprise integration means IT teams can easily manage and troubleshoot common issues.
- Offline Diagnostics: Utilize the in-app benchmarking suite and system logs (if available) for offline diagnostics.
- Contacting External Support: For issues that cannot be resolved internally, Iternal Technologies provides support via secure channels. Contact the product team at support@iternal.ai through your organization's approved, secure external communication methods. Be prepared to provide detailed logs and system information obtained offline.
8. Conclusion: Your Partner for Mission-Ready Artificial Intelligence
AirgapAI, by Iternal Technologies, stands as a testament to secure, powerful, and accessible Artificial Intelligence, purpose-built for the most demanding environments. By deploying AirgapAI on your Artificial Intelligence Personal Computer, you are not just adopting a new technology; you are empowering your workforce with mission-critical capabilities where connectivity cannot go. You are ensuring:
- Unparalleled Security: 100% local operation means zero data leakage risk.
- Trusted Accuracy: Blockify technology delivers 78 times more accurate Artificial Intelligence results.
- Cost-Effectiveness: A perpetual license provides significant savings over cloud alternatives.
- Offline Readiness: Continuous operation, anytime, anywhere, ensuring mission continuity.
AirgapAI is your strategic advantage, transforming the challenge of secure, air-gapped operations into an opportunity for advanced Artificial Intelligence deployment.
Download the free trial of AirgapAI today and experience mission-ready Artificial Intelligence firsthand at: https://iternal.ai/airgapai