Data protection information on the use of InnoGPT
This document comprehensively describes the data protection-relevant processes, the technical and organizational measures, and the principles of data processing that apply when using the InnoGPT platform. It is aimed at data protection officers, compliance officers and other professionally responsible persons in organizations that use InnoGPT or review its use.
1. Principles and Responsibilities
Purpose and scope
This document describes the data protection-relevant processing operations of the InnoGPT platform as a technical infrastructure for AI services. InnoGPT acts as a processor within the meaning of Art. 28 GDPR for the content entered or uploaded by customers.
The scope of application covers exclusively the technical processing carried out by InnoGPT to provide the platform functionalities. This includes the transfer, temporary processing and storage of user content, the provision of AI interfaces, and the implementation of security and deletion functions.
Important notice on responsibility: Responsibility for all texts, uploaded files and user-generated content lies exclusively with the respective customer as controller within the meaning of Art. 4 No. 7 GDPR. InnoGPT processes this content exclusively on the customer's instructions as part of the technical service.
Categories of processed personal data
InnoGPT processes the following categories of data, which are entered or uploaded to the platform by users on their own responsibility:
Content data entered by users: All text inputs (prompts), uploaded documents and files, and user-defined memories and notes. Entering this data is voluntary on the part of the user; InnoGPT does not actively collect it.
System-generated data: Responses generated by AI models to user requests, and chat histories resulting from the interaction between user and system.
Technical operating data: Metadata such as time stamps of interactions, AI models used, workspace configurations, session data, browser information and connection parameters required for the technical operation of the platform.
User management data: First and last name, email address, workspace membership, user role and permissions, login timestamps required to provide the service.
Note: What personal data is contained in user content is beyond InnoGPT's control and is the sole responsibility of the customer as the controller.
Responsibilities and distribution of roles
InnoGPT as processor: InnoGPT acts as a processor within the meaning of Art. 28 GDPR for all content entered or uploaded by users. Processing is carried out exclusively on the customer's instructions to provide the technical platform functionalities. InnoGPT has no influence on the type, scope or purpose of the personal data contained in user content.
Customer as controller: As the controller within the meaning of Art. 4 No. 7 GDPR, the customer is responsible for all content that its users enter or upload to the platform. This includes determining the purposes and means of processing, ensuring valid legal bases, complying with the principles of Art. 5 GDPR and fulfilling information obligations towards data subjects.
Data processing agreement: The cooperation between InnoGPT and the customer is governed by a data processing agreement in accordance with Art. 28 GDPR, which defines in detail the technical and organizational measures, the customer's authority to issue instructions, deletion obligations and audit rights.
User Responsibility and Compliance Obligations
Legality of user content: Users are exclusively responsible for ensuring that all content they enter or upload complies with applicable data protection regulations. This includes in particular the existence of valid legal bases for processing personal data and compliance with the principles of data minimization and purpose limitation.
Required consents and information: Before entering personal data from third parties, users must ensure that all necessary consents have been obtained or that other legal bases have been met. Data subjects must also have been duly informed of the processing.
Data protection impact assessment: Customers are required to check whether a data protection impact assessment in accordance with Article 35 GDPR is required for their planned use of the platform. InnoGPT provides the necessary technical information for this purpose, but the legal assessment is the responsibility of the customer.
Documentation requirements: Customers must properly document their processing activities and maintain a record of processing activities in accordance with Art. 30 GDPR that also covers the use of InnoGPT.
Disclaimers and usage restrictions
User Content Disclaimer: InnoGPT assumes no responsibility for the legality, accuracy or appropriateness of user content. The platform acts as a technical tool; responsibility for content and its legal compliance lies exclusively with the user or customer.
Exclusion of content review: InnoGPT does not systematically review user content for its admissibility under data protection law. Automated security checks are used exclusively to protect the technical infrastructure, not to monitor legal compliance.
Usage restrictions: The use of the platform for illegal purposes, in particular the processing of personal data without a valid legal basis, is prohibited. Violations can result in immediate account suspension.
Reporting requirements: InnoGPT notifies customers immediately of any data breaches that could affect processed content. The customer, as controller, is responsible for assessing their significance and for reporting to supervisory authorities and data subjects. InnoGPT helps to clarify technical aspects and provides relevant information.
Limits of technical support: The technical functions provided by InnoGPT (export, deletion, etc.) do not release the customer from its responsibility as controller. The legal assessment and the decision on how data subject rights are to be fulfilled are the sole responsibility of the customer.
2. Legal basis and compliance
GDPR compliance and processing principles
Technical processing by InnoGPT is carried out in full compliance with the General Data Protection Regulation (GDPR) and other applicable data protection laws. The application of the GDPR principles is differentiated depending on the role of InnoGPT:
For technical platform data (InnoGPT as controller):
Lawfulness, fairness and transparency: Technical data (user administration, system logs, metadata) is processed on clear legal bases in accordance with Art. 6 GDPR. The processing is transparently documented and traceable.
Purpose limitation: Technical data is processed exclusively for the provision and secure operation of the platform.
Data minimization: Only technical data required for platform operation is processed.
For user content (InnoGPT as processor):
Processing on instructions: User content is processed exclusively on the instructions of the customer as controller. InnoGPT only provides the technical infrastructure.
Compliance with principles by customers: Compliance with the GDPR principles (lawfulness, purpose limitation, data minimization, accuracy, storage limitation) for user content is the responsibility of the customer. InnoGPT supports customers technically with appropriate functionalities.
Joint technical and organizational measures:
Integrity and confidentiality: Appropriate technical and organizational measures ensure the security of all data against unauthorised or unlawful processing and against accidental loss, destruction or damage.
Storage limitation: Automated deletion processes and clear retention rules are technically implemented and can be configured in coordination with InnoGPT.
Legal basis for processing
Processing by InnoGPT is carried out on the following legal bases:
For technical services and user administration:
• Art. 6 para. 1 lit. b GDPR (performance of a contract): Provision of the technical platform, user administration, system access and basic functionalities.
• Art. 6 para. 1 lit. f GDPR (legitimate interests): System administration, security monitoring, fault diagnosis and performance optimization in the legitimate interest of proper operation.
For user content (processing on behalf of the customer):
• Art. 28 GDPR: InnoGPT processes all content entered or uploaded by users exclusively as a processor on the customer's instructions. The legal basis for this processing must be determined and documented by the customer as controller.
Important note: Customers are required to ensure and document a valid legal basis in accordance with Article 6 GDPR for all personal data they enter into the platform.
Data subject rights and their implementation
The implementation of data subject rights under Articles 15-22 GDPR is divided between InnoGPT and the customer:
Responsibility of InnoGPT (technical support):
• Provision of technical functions for data extraction, correction and deletion in the user interface
• Implementation of customer deletion requests within 48 hours
• Export of user data in a structured format (JSON) upon customer request (an illustrative export structure is sketched at the end of this subsection)
• Logging and documentation of technical measures carried out
Customer responsibility (legal obligation):
• Receipt and legal assessment of data subject inquiries
• Verification of the identity and authorization of the person making the request
• Deciding on the scope and nature of the rights to be granted
• Communication with affected persons
• Ensuring lawful processing within its own area of responsibility
Note: InnoGPT cannot independently answer data subject inquiries relating to user content, as the legal responsibility lies with the customer. Such inquiries are forwarded to the customer.
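To illustrate the structured JSON export mentioned in the list above, the following purely hypothetical sketch shows what such an export could contain; the field names and structure are assumptions chosen for illustration and are not the actual InnoGPT export schema.

```python
# Purely hypothetical sketch of a structured JSON export of a user's data.
# All field names are illustrative assumptions, not the actual export schema.
import json

example_export = {
    "user": {"name": "Jane Doe", "email": "jane.doe@example.com"},
    "workspace": {"id": "workspace-a", "role": "member"},
    "conversations": [
        {
            "id": "c1",
            "created_at": "2025-06-01T09:30:00Z",
            "messages": [
                {"role": "user", "content": "Please summarize the attached report."},
                {"role": "assistant", "content": "The report covers ..."},
            ],
        }
    ],
}

print(json.dumps(example_export, indent=2))
```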
3. Technical infrastructure and security
Infrastructure and hosting architecture
Core data processing is carried out primarily in the Federal Republic of Germany via AWS data centers in Frankfurt am Main. The frontend is hosted by Vercel, the database is provided via Supabase, and both use AWS infrastructure in Frankfurt. In the event of exceptional load peaks or technical faults, other EU regions can be used temporarily.
The hosting architecture is based on cloud-native services with automated scaling and integrated failover mechanisms. Availability and data integrity are ensured by cloud providers' redundant systems. Security updates and infrastructure security are the responsibility of the respective providers (AWS, Vercel, Supabase), who have the appropriate certifications (ISO 27001, SOC 2 Type II).
All infrastructure partners (Vercel, Supabase, AWS) are engaged exclusively on the basis of data processing agreements in accordance with Art. 28 GDPR. These agreements contain detailed requirements for technical and organizational measures, deletion obligations, audit rights and liability arrangements.
Encryption and security measures
Encryption in transit: All data transfers between user devices and the platform take place via Transport Layer Security (TLS) version 1.3 or higher. Encryption uses strong cipher suites and perfect forward secrecy, so that the confidentiality of past communications is preserved even if private keys are later compromised.
Encryption at rest: All persistent data is secured by the cloud providers (AWS, Vercel, Supabase) with industry-standard encryption (AES-256). Key management is carried out by the native security services of the respective providers (AWS KMS, etc.) with automated security standards. Sensitive authentication data and session tokens are subject to additional protection measures at the application level.
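As an illustration of additional protection measures at the application level, the following sketch shows AES-256 (GCM) encryption of a session token using the widely used Python cryptography package; the locally generated key is for demonstration only and does not reflect the actual key management, which is handled by the providers' KMS services as described above.

```python
# Simplified illustration of AES-256 (GCM) encryption at the application level.
# The locally generated key is for demonstration only; in production, keys are
# managed by the cloud providers' KMS services.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit key (AES-256)
aesgcm = AESGCM(key)

nonce = os.urandom(12)                      # unique nonce for each encryption
token = b"example-session-token"
ciphertext = aesgcm.encrypt(nonce, token, None)

# Decryption verifies integrity: tampering with nonce or ciphertext raises InvalidTag.
assert aesgcm.decrypt(nonce, ciphertext, None) == token
```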
Access control: A multi-level authorization concept regulates access to various system levels. Administrative access to production systems requires multi-factor authentication and is logged. Privileged access is limited in time and is subject to the need-to-know principle.
Tenant separation: The logical isolation of different workspaces is enforced via row-level security (RLS) at the database level. Each workspace receives a unique tenant ID, which is stored in all relevant records. Database queries are implemented so that only data belonging to the corresponding tenant can be retrieved.
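The effect of this tenant scoping can be shown with a deliberately simplified sketch; in production the constraint is enforced by row-level security in the database itself, and the table and field names below are assumptions for illustration only.

```python
# Illustrative sketch of the tenant-scoping rule that row-level security
# enforces at the database level. Field and record names are assumptions.
from dataclasses import dataclass

@dataclass(frozen=True)
class Conversation:
    id: str
    workspace_id: str   # tenant ID stored on every relevant record
    content: str

DATABASE = [
    Conversation("c1", "workspace-a", "Hello from A"),
    Conversation("c2", "workspace-b", "Hello from B"),
]

def fetch_conversations(current_workspace_id: str) -> list[Conversation]:
    """Every read is constrained to the caller's workspace (tenant).

    In production this constraint is enforced by the database itself via RLS,
    not only in application code.
    """
    return [row for row in DATABASE if row.workspace_id == current_workspace_id]

# A user in workspace A can never retrieve workspace B's records:
assert all(row.workspace_id == "workspace-a"
           for row in fetch_conversations("workspace-a"))
```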
Management of AI models and international transfers
Workspace administrators can granularly define which AI models are available for their team. The processing location and hosting category are shown transparently for each model:
Category 1 — EU hosting/EU processing: Models that are hosted and processed entirely within the European Union. There is no transfer to third countries here.
Category 2 — EU hosting/global processing: Models that are primarily hosted in the EU but whose processing may shift to other regions during peak loads. Such transfers are covered by adequacy decisions of the EU Commission or by standard contractual clauses.
Category 3 — Global Hosting/Global Processing: Models that are hosted and processed outside the EU. Here, transfers to third countries are made on the basis of adequacy decisions or standard contractual clauses with additional technical and organizational measures.
Customers can make their choices based on their specific compliance requirements.
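For illustration, the following sketch shows one possible way of representing the three hosting categories and a workspace's model allow-list; the model names and data structures are assumptions and do not describe the actual InnoGPT configuration.

```python
# Illustrative sketch of hosting categories and a workspace model allow-list.
# Model names and structures are assumptions, not the actual configuration.
from enum import Enum

class HostingCategory(Enum):
    EU_HOSTING_EU_PROCESSING = 1          # no third-country transfer
    EU_HOSTING_GLOBAL_PROCESSING = 2      # EU by default, other regions at peak load
    GLOBAL_HOSTING_GLOBAL_PROCESSING = 3  # transfers based on adequacy decisions / SCCs

MODEL_CATALOGUE = {
    "example-eu-model": HostingCategory.EU_HOSTING_EU_PROCESSING,
    "example-global-model": HostingCategory.GLOBAL_HOSTING_GLOBAL_PROCESSING,
}

def allowed_models(max_category: HostingCategory) -> list[str]:
    """Return the models available to a workspace, given the strictest
    hosting category its administrators accept."""
    return [name for name, category in MODEL_CATALOGUE.items()
            if category.value <= max_category.value]

# A workspace that only accepts Category 1 sees only EU-hosted models:
print(allowed_models(HostingCategory.EU_HOSTING_EU_PROCESSING))
```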
4. Data processing and flow
Detailed data flow description
Phase 1 - User request and security validation: The user's input is first subjected to an automated security check to identify potentially harmful content or attempts at misuse. In parallel, the user is authenticated and authorized, and workspace permissions are checked.
Phase 2 - Contextualization and enrichment: The system analyzes the request in the context of the conversation so far. Relevant messages from the conversation history and suitable content from the user-defined memories are identified and compiled. Relevance is determined using semantic search methods and machine learning algorithms.
Phase 3 - Model selection and submission: Based on the type of request and workspace settings, the optimal AI model is selected. The contextualized data package is transferred to the appropriate model provider via a secure API interface. Only the data required for processing is transmitted.
Phase 4 - Processing by the AI model: The external model processes the request taking into account the submitted context. The model provider is contractually obliged to delete the data immediately after processing (zero retention policy). Use for training purposes is expressly prohibited.
Phase 5 - Response processing and quality control: The response generated by the model is automatically checked for potentially problematic content and then formatted for display in the user interface.
Phase 6 — Storage and delivery: The complete conversation is stored in the database in encrypted form, recording both the original user request and the AI response. The response is then transmitted to the user's browser in TLS-encrypted form and displayed there.
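As a simplified illustration of how these six phases interact, the following sketch models the flow in plain Python; all function names, data shapes and the in-memory storage are assumptions made for illustration and do not describe the actual InnoGPT implementation.

```python
# Simplified, illustrative model of the six processing phases described above.
# All function bodies are stand-ins; they do not describe the real implementation.
from typing import Callable

def validate_input(prompt: str) -> None:
    """Phase 1: automated security check of the raw user input."""
    if not prompt.strip():
        raise ValueError("empty prompt rejected")

def build_context(history: list[str], max_messages: int = 5) -> str:
    """Phase 2: compile relevant conversation context (here simply the most
    recent messages; the real system determines relevance semantically)."""
    return "\n".join(history[-max_messages:])

def check_and_format(answer: str) -> str:
    """Phase 5: check the model response and format it for display."""
    return answer.strip()

def handle_request(prompt: str, history: list[str],
                   call_model: Callable[[str], str]) -> str:
    validate_input(prompt)                          # Phase 1: security validation
    context = build_context(history)                # Phase 2: contextualization
    payload = f"{context}\nUser: {prompt}"          # Phase 3: minimal data package
    raw_answer = call_model(payload)                # Phase 4: external model (zero retention)
    answer = check_and_format(raw_answer)           # Phase 5: quality control
    history += [f"User: {prompt}", f"Assistant: {answer}"]  # Phase 6: storage
    return answer                                   # Phase 6: delivery to the user

# Example run with a dummy model standing in for a real provider API:
history: list[str] = []
print(handle_request("Hello!", history, call_model=lambda p: "Hi, how can I help?"))
```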
Tool integration and advanced processing scenarios
Web search and external data sources: If the AI model needs up-to-date information for an appropriate answer, it can request that a web search be performed. This is done via isolated systems that do not allow any conclusions to be drawn about the requesting user or workspace. The search results are filtered and returned to the model anonymously.
Document processing (RAG system): Uploaded documents are processed by a retrieval-augmented generation system. The document content is converted into semantic vectors and stored in a secure vector database. These vectors make it possible to identify relevant document passages and integrate them into the context for subsequent inquiries. The original documents are stored in encrypted form.
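The retrieval step of such a RAG system can be illustrated with a deliberately simplified sketch; the word-overlap similarity below merely stands in for real semantic embeddings, and encryption, the vector database and access controls are omitted for brevity.

```python
# Illustrative sketch of retrieval-augmented generation (retrieval step only).
# The word-overlap "embedding" is a stand-in for a real embedding model.
def embed(text: str) -> set[str]:
    """Toy embedding: the set of lowercase words (real systems use dense vectors)."""
    return set(text.lower().split())

def similarity(a: set[str], b: set[str]) -> float:
    """Jaccard overlap as a stand-in for cosine similarity between vectors."""
    return len(a & b) / len(a | b) if a | b else 0.0

# Document passages are converted once at upload time and stored in an index:
passages = [
    "Invoices are retained for ten years.",
    "Support requests are answered within two business days.",
]
index = [(passage, embed(passage)) for passage in passages]

def retrieve(query: str, top_k: int = 1) -> list[str]:
    """Identify the most relevant passages and return them as context for the model."""
    query_vector = embed(query)
    ranked = sorted(index, key=lambda item: similarity(query_vector, item[1]),
                    reverse=True)
    return [passage for passage, _ in ranked[:top_k]]

print(retrieve("How long are invoices kept?"))  # -> the retention passage
```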
Hybrid processing approaches: If an AI model offers native document processing functions, both the internal RAG system and the native functions can be used in parallel. This optimizes response quality and speed. The same data protection and security standards apply in all cases.
Storage, retention and deletion concepts
Active storage: Conversations are stored for ongoing use by the user. Storage uses AES-256 encryption with tenant-specific key management. Access to stored data is fully logged.
Inactivity-based deletion: Conversations that are not retrieved or continued for 180 days are considered inactive. These are automatically identified and irrevocably deleted.
Immediate deletion upon request: Users can delete specific conversations themselves at any time. Deletion of an entire account is completed within 48 hours and covers all associated data and metadata.
Backup and recovery: Backups are created daily and are also stored in encrypted form. The storage period for backups is a maximum of 7 days, after which they are automatically cryptographically deleted by destroying the encryption keys.
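For illustration, the following simplified sketch shows how an automated job could apply the 180-day inactivity rule described above; the data structures are assumptions and do not describe the actual deletion mechanism.

```python
# Illustrative sketch of an inactivity-based deletion job (180-day rule).
# Data structures are assumptions; the real mechanism also covers backups.
from datetime import datetime, timedelta, timezone

INACTIVITY_LIMIT = timedelta(days=180)

# Conversation ID -> time of last retrieval or continuation
last_access = {
    "c1": datetime(2025, 1, 10, tzinfo=timezone.utc),
    "c2": datetime(2025, 8, 1, tzinfo=timezone.utc),
}

def purge_inactive(last_access: dict[str, datetime], now: datetime) -> list[str]:
    """Irrevocably delete conversations not retrieved or continued for 180 days."""
    expired = [cid for cid, ts in last_access.items()
               if now - ts > INACTIVITY_LIMIT]
    for cid in expired:
        del last_access[cid]   # in production: hard delete plus backup expiry schedule
    return expired

print(purge_inactive(last_access, now=datetime(2025, 8, 12, tzinfo=timezone.utc)))
# -> ['c1']  (inactive for more than 180 days)
```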
Zero retention policy and contract drafting
The zero retention policy ensures that user content is neither retained by model providers after processing nor used for training purposes. This assurance is enshrined in all contracts with model providers:
Contract components: Explicit prohibition of using transmitted data for model training, obligation to delete data immediately after processing, regular audit rights, liability regulations for violations.
Technical enforcement: In addition to the contractual agreements, technical measures are implemented to prevent improper use. This includes data tagging, transfer logging, and automated compliance checks.
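As an example of the transfer logging mentioned above, the following simplified sketch records each transmission to an external model provider; the log fields are assumptions chosen for illustration and do not describe the actual logging format.

```python
# Illustrative sketch of transfer logging as a technical enforcement measure.
# Log fields are assumptions; only metadata is recorded, never the content itself.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
transfer_log = logging.getLogger("model_transfer")

def log_transfer(workspace_id: str, model: str, payload_bytes: int) -> None:
    """Record what was sent, when and to which model, so that automated
    compliance checks can verify transmissions afterwards."""
    transfer_log.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "workspace_id": workspace_id,
        "model": model,
        "payload_bytes": payload_bytes,   # size only, never the transmitted content
    }))

log_transfer("workspace-a", "example-eu-model", 2048)
```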
Monitoring and compliance: Regular reviews ensure that model providers comply with agreements. In the event of violations, defined escalation procedures apply up to the immediate termination of cooperation.
5. Governance and Partner Management
Incident response and reporting procedures
Detection: Automated monitoring systems continuously monitor system integrity and identify potential security incidents in real time.
Response: In the event of an incident, a defined escalation procedure applies. The incident response team is automatically notified and takes appropriate action.
Reporting requirements: Data breaches are reported to the relevant supervisory authority within 72 hours. Data subjects will be informed immediately if there is a high risk to their rights and freedoms.
Documentation: All incidents are fully documented, including cause, effects, actions taken, and lessons learned.
Processors and supply chain
Due diligence: Processors are selected based on established certifications, evidence of compliance and publicly available security documentation. Cloud providers demonstrate quality through their industry-standard certifications (ISO 27001, SOC 2 Type II).
Contract design: Data processing agreements contain the requirements of Art. 28 GDPR regarding security measures, deletion obligations, audit rights and sub-processing.
Third-country transfers: Transfers to AI providers outside the EU are made on the basis of standard contractual clauses with additional technical and organizational measures and under the EU-U.S. Data Privacy Framework. Customers are transparently informed about all third-country transfers.
Supply chain: Sub-processors are subject to the same compliance requirements; customers are informed accordingly of any changes.
Quality assurance and certifications
Internal security guidelines: Documented information security and data protection guidelines govern all relevant processing processes and are regularly reviewed and updated.
Code reviews and security tests: All production-relevant code goes through multi-stage reviews with a focus on security and data protection. Regular security tests validate the effectiveness of the implemented measures.
Infrastructure standards: The hosting infrastructure is based on established cloud providers with appropriate certifications (ISO 27001, SOC 2 Type II, etc.). Through the data processing agreements, these standards are also binding on InnoGPT.
Continuous monitoring: Automated monitoring systems continuously monitor the system integrity, availability, and security of the platform.
Documentation and processes: All data protection and security-related processes are fully documented and are regularly reviewed for timeliness and effectiveness.
Training and awareness raising: All employees with access to personal data undergo regular data protection and security training.
6. Contact and Support
Contact persons
Data Protection Officer:
DataGap GmbH
Mr. Markus Altenburg
Bessemerstraße 82, 10th floor south
12103 Berlin
Telephone: 030 — 577 10 513
Email: team@datagap.de
Website: www.datagap.de
General data protection issues:
Email: datenschutz@inno-ki.de
Technical Support:
Technical questions about implementation and configuration: support@innogpt.de
Last updated: Aug 12, 2025