
# Import & Export

The Import & Export feature lets you import and export data for your project's data types in JSON or CSV format.

## Overview

Located at `Settings > Import & Export`, this module provides a step-by-step workflow for both importing and exporting data with support for data type conversion and field mapping.

## Features

### Export
- **JSON Format**: Export data as structured JSON arrays
- **CSV Format**: Export data as comma-separated values
- **Filtering**: Use DataView filters to export specific subsets of data
- **Column Selection**: Choose which properties to include in the export

### Import
- **JSON Format**: Import structured JSON data
- **CSV Format**: Import CSV files with automatic column detection
- **Data Preview**: View the first 10 rows before importing
- **Property Mapping**: Map source columns to target properties
- **Data Converters**: Transform data during import with built-in converters:
  - Text to Number
  - Text to Boolean
  - Date to Timestamp
  - Timestamp to Date
  - Parse JSON
  - Stringify JSON
- **Batch Import**: Import large datasets efficiently
- **Error Reporting**: See detailed error messages for failed imports
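The built-in converters can be pictured as simple value transforms applied per column during import. The actual implementations run server-side; the sketch below is an illustrative approximation, and details such as which strings count as `true` are assumptions.

```typescript
// Illustrative sketches of the built-in converter behaviors (not the
// server-side implementations). Keys match the converter names above.
const converters = {
  textToNumber: (v: string): number => Number(v),
  // Assumption: truthy strings are "true", "1", "yes" (case-insensitive).
  textToBoolean: (v: string): boolean =>
    ["true", "1", "yes"].includes(v.trim().toLowerCase()),
  dateToTimestamp: (v: string): number => new Date(v).getTime(),
  timestampToDate: (v: number): string => new Date(v).toISOString(),
  parseJson: (v: string): unknown => JSON.parse(v),
  stringifyJson: (v: unknown): string => JSON.stringify(v),
};
```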

## Usage

### Exporting Data

1. Navigate to `Settings > Import & Export`
2. Select the data type you want to export
3. Choose "Export" as the direction
4. Select your preferred format (JSON or CSV)
5. Click "Export Data" to download the file

You can also access Import & Export directly from a DataView widget by clicking the import/export button in the action row.

### Importing Data

1. Navigate to `Settings > Import & Export`
2. Select the target data type
3. Choose "Import" as the direction
4. Select the format of your file (JSON or CSV)
5. Upload your file
6. Click "Preview" to see the first few rows
7. Map source columns to target properties
8. (Optional) Apply data converters for type transformation
9. Click "Import Data" to start the import

The import process will show you:
- Total number of rows processed
- Number of successful imports
- Number of failed imports
- Detailed error messages for failures

## API Endpoints

### Export Data
```
POST /api/projects/{projectId}/import-export/export
```

Request body:
```json
{
  "typeId": "string",
  "format": "json" | "csv",
  "filter": {},
  "columns": ["field1", "field2"]
}
```

Response:
- Returns the exported data as JSON array or CSV file
- File is served as a downloadable attachment with timestamp
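A minimal client call might look like the following sketch. The endpoint path and request fields follow the schema above; the helper names and the absence of an auth header are assumptions for illustration.

```typescript
// Request body shape from the export schema above.
interface ExportRequest {
  typeId: string;
  format: "json" | "csv";
  filter?: Record<string, unknown>; // optional DataView filter
  columns?: string[];               // properties to include
}

// Hypothetical helper: build a CSV export request for selected columns.
function buildExportRequest(typeId: string, columns: string[]): ExportRequest {
  return { typeId, format: "csv", filter: {}, columns };
}

// Sketch: POST the request and receive the exported file content.
async function exportData(projectId: string, req: ExportRequest) {
  const res = await fetch(`/api/projects/${projectId}/import-export/export`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(req),
  });
  if (!res.ok) throw new Error(`Export failed: ${res.status}`);
  return res.blob(); // served as a downloadable attachment
}
```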

### Import Data (Preview & Execute)
```
POST /api/projects/{projectId}/import-export/import
```

Request body:
```json
{
  "typeId": "string",
  "format": "json" | "csv",
  "data": "string",
  "mappings": {
    "sourceField": "targetField"
  },
  "converters": {
    "sourceField": "textToNumber"
  }
}
```

Response:
```json
{
  "total": 100,
  "imported": 95,
  "failed": 5,
  "errors": ["Row 2: Invalid email format", "..."]
}
```

**Note**: The endpoint combines preview and import operations. The client performs the preview locally before sending the import request.
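An import call can be sketched as below, following the request and response schema above. The column name `age_text` and the helper names are hypothetical; error handling is an assumption.

```typescript
// Response shape from the import schema above.
interface ImportResult {
  total: number;
  imported: number;
  failed: number;
  errors: string[];
}

// Hypothetical helper: map a source column "age_text" onto the target
// property "age" and convert it to a number during import.
function buildImportRequest(typeId: string, csvText: string) {
  return {
    typeId,
    format: "csv" as const,
    data: csvText,
    mappings: { age_text: "age" },            // source column -> target property
    converters: { age_text: "textToNumber" }, // built-in converter name
  };
}

// Sketch: POST the import request and read the result summary.
async function importCsv(projectId: string, typeId: string, csvText: string) {
  const res = await fetch(`/api/projects/${projectId}/import-export/import`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildImportRequest(typeId, csvText)),
  });
  if (!res.ok) throw new Error(`Import failed: ${res.status}`);
  return (await res.json()) as ImportResult;
}
```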

## Smart Import (AI-Powered)

Smart Import uses AI to extract structured data from websites, files, or images and map it to your data types automatically.

### Features

- **From Website**: Paste a URL and the AI extracts data from the page content
- **From File**: Upload a PDF, TXT, CSV, or JSON file for AI extraction
- **From Image**: Upload an image or take a photo (mobile camera supported) for AI-based data recognition
- **Related Entities**: Automatically detects and extracts related data types from the same source
- **Preview & Edit**: Review and modify extracted data before saving

### Usage

1. Open any DataView and click the **Smart Import** button (wand icon) in the action bar
2. Choose a source type: URL, File, or Image
3. Provide the source data (paste URL, upload file, or upload/capture image)
4. Click **Extract data** to start AI extraction
5. Review the extracted data in the editor — modify fields as needed
6. Optionally select/deselect related entities found in the source
7. Click **Save** to create the entity and any selected related entities with relations

### API Endpoint

```
POST /api/projects/{projectId}/types/{typeId}/ai-extract
```

**Authorization**: Requires `RLS_UPDATE_PERMISSION` on the project.

Request body:
```json
{
  "source": "url" | "file" | "image",
  "url": "https://example.com/page",
  "content": "<base64-encoded file content>",
  "imageBase64": "<base64-encoded image>",
  "fileName": "document.pdf",
  "mimeType": "application/pdf"
}
```

| Field | Required | Description |
|-------|----------|-------------|
| `source` | Yes | Source type: `"url"`, `"file"`, or `"image"` |
| `url` | When source=url | URL to fetch and extract data from |
| `content` | When source=file | Base64-encoded file content (PDF, TXT, CSV, JSON) |
| `imageBase64` | When source=image | Base64-encoded image data |
| `fileName` | No | Original file name (used for format detection) |
| `mimeType` | No | MIME type of the file or image |

Response:
```json
{
  "primary": { "title": { "de": "Extracted title" }, "field": "value" },
  "related": [
    { "typeId": "contacts", "typeName": "Contact", "data": { "name": { "de": "John" } } }
  ],
  "sourceWarning": "smartImport.error.jsWarning"
}
```

**Size limits**: Maximum 10 MB for file content and image data. PDF parsing is limited to the first 10 pages. Text content is truncated to 30,000 characters before AI processing.

**SSRF protection**: The URL source validates that the target host is public (blocks private/loopback IPs and non-HTTP protocols).
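A URL-based extraction request can be sketched as follows. The endpoint path and fields follow the schema above; the helper names are hypothetical, and a non-OK status is assumed to cover rejections such as SSRF blocks or size-limit violations.

```typescript
// Hypothetical helper: build a URL-source extraction request.
function buildExtractRequest(url: string) {
  return { source: "url" as const, url };
}

// Sketch: POST to the ai-extract endpoint and return the extracted data
// ({ primary, related, sourceWarning? } per the response schema above).
async function aiExtract(projectId: string, typeId: string, url: string) {
  const res = await fetch(
    `/api/projects/${projectId}/types/${typeId}/ai-extract`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(buildExtractRequest(url)),
    },
  );
  if (!res.ok) throw new Error(`Extraction failed: ${res.status}`);
  return res.json();
}
```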

## Best Practices

1. **Always preview before importing**: Check that your data looks correct before executing the import
2. **Use appropriate converters**: Apply data converters to ensure data types match your schema
3. **Map all required fields**: Make sure all required properties have mappings
4. **Export before modifying**: Always export your current data before making bulk changes
5. **Test with small datasets**: When importing for the first time, test with a small file to verify the mappings
6. **Handle errors**: Review the error report after import to identify and fix any issues

## Limitations

- CSV files must be properly formatted with headers in the first row
- JSON files must be valid JSON arrays or objects
- Large imports may take several minutes depending on file size
- Import errors are limited to the first 100 messages in the UI

## Security

- Import/Export requires appropriate project and type permissions
- **Export**: Users must have READ permission for the target type
- **Import**: Users must have CREATE permission for the target type and must be able to create documents
- Type-level access control is enforced through `creationAccess` rules
- System types bypass creation access checks
- All operations are logged for audit purposes
