Quick Setup Guide

Get Eaternity Forecast running in your kitchen in four straightforward steps. This guide covers the setup process from initial contact to receiving your first predictions.

Overview

Timeline

| Phase      | Duration  | Key Activities                           |
|------------|-----------|------------------------------------------|
| Setup      | 1-2 weeks | Integration, data import, configuration  |
| Training   | 1-2 weeks | Neural network learns your patterns      |
| Testing    | 1 week    | Validate predictions, gather feedback    |
| Production | Ongoing   | Daily predictions for planning           |

Total time to first predictions: 2-4 weeks depending on integration method

Prerequisites

Before starting, ensure you have:

  • ✅ Identified your POS/ERP system and data access method
  • ✅ Designated a team contact person
  • ✅ Confirmed minimum 30 days of historical sales data available
  • ✅ Contacted Eaternity to begin onboarding

Step 1: System Integration

Choose your integration method based on your technical setup.

Option A: Necta Integration (Fastest)

Best for: Existing Necta customers

Setup Process:

  1. Contact Necta Account Manager

    Email subject: "Activate Eaternity Forecast Integration"
    Include: Your company name and Necta account ID
  2. Eaternity Configuration

    • Our team receives notification from Necta
    • We configure the connection (no action required from you)
    • Historical data automatically imported from Necta database
  3. Verification

    • Receive confirmation email when connection is active
    • Log into Necta to verify Forecast module appears
    • Historical data import status visible in dashboard

Timeline: 3-5 business days

Option B: Direct API Integration

Best for: Custom POS/ERP systems with technical resources

Setup Process:

  1. Review API Documentation

  2. Implement Data Endpoints

    Create endpoints for:

    Sales Data Export (Required):

    POST /api/forecast/sales
    {
      "date": "2024-01-15",
      "items": [
        {
          "name": "Pasta Carbonara",
          "quantity": 45,
          "service_period": "lunch",
          "category": "Main Course"
        }
      ]
    }

    Historical Data Import (Initial setup):

    POST /api/forecast/sales/bulk
    {
      "start_date": "2023-10-01",
      "end_date": "2024-01-15",
      "items": [...]
    }
  3. Authentication Setup

    Coordinate with Eaternity team:

    • Receive API credentials
    • Configure OAuth 2.0 or API key authentication
    • Test connection with sandbox environment
  4. Testing

    Validate integration:

    # Test authentication
    curl -X POST https://api.eaternity.org/v1/forecast/auth \
    -H "Content-Type: application/json" \
    -d '{"api_key": "your_api_key"}'

    # Test sales data submission
    curl -X POST https://api.eaternity.org/v1/forecast/sales \
    -H "Authorization: Bearer your_token" \
    -H "Content-Type: application/json" \
    -d '{"date": "2024-01-15", "items": [...]}'
  5. Historical Data Import

    Bulk import your historical data:

    • Export sales data from your POS/ERP (CSV, JSON, or Excel)
    • Transform to required format using provided scripts
    • Submit via bulk import endpoint
    • Monitor import progress in dashboard

Timeline: 2-4 weeks depending on complexity
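
If your team scripts the integration in Python, the two calls described in step 2 might look roughly like the sketch below. The endpoint paths and payload fields mirror the examples in this guide; the base URL, token handling, and error handling are assumptions to confirm against the current API documentation.

# Minimal sketch of the two data submissions described above.
# Assumes a token obtained during the authentication step; endpoint
# paths mirror the examples in this guide and should be confirmed
# against the current API documentation.
import requests

BASE_URL = "https://api.eaternity.org/v1"
TOKEN = "your_token"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def submit_daily_sales(date, items):
    """Send one day of item-level sales (the required daily export)."""
    payload = {"date": date, "items": items}
    resp = requests.post(f"{BASE_URL}/forecast/sales", json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def submit_bulk_history(start_date, end_date, items):
    """Send the one-time historical import used for initial training."""
    payload = {"start_date": start_date, "end_date": end_date, "items": items}
    resp = requests.post(f"{BASE_URL}/forecast/sales/bulk", json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

# Example call with a single item, matching the format shown above
submit_daily_sales("2024-01-15", [
    {"name": "Pasta Carbonara", "quantity": 45,
     "service_period": "lunch", "category": "Main Course"},
])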

Option C: Manual Upload

Best for: Initial setup phase or smaller operations

Setup Process:

  1. Download Data Template

    Request template from Eaternity support:

    • Excel spreadsheet with required fields
    • Sample data for reference
    • Validation formulas to check data quality
  2. Export Sales Data from POS

    Extract historical data:

    • Minimum 30 days (90+ days recommended)
    • Item-level quantities, not just revenue
    • Date stamps for each transaction
  3. Format Data

    Required columns:

    date | item_name | quantity_sold | service_period | price | category

    Example:

    2024-01-15,Pasta Carbonara,45,lunch,14.50,Main Course
    2024-01-15,Caesar Salad,32,lunch,9.00,Starter
    2024-01-15,Grilled Salmon,28,lunch,18.50,Main Course
  4. Upload to Portal

    • Access secure upload portal (link provided by coordinator)
    • Upload formatted CSV/Excel file
    • Verify data preview before confirming
    • Receive confirmation email when processing complete
  5. Set Up Recurring Uploads

    For ongoing predictions:

    • Weekly upload schedule (Monday recommended)
    • Export previous week's sales data
    • Upload via portal or SFTP
    • 15-30 minutes per week

Timeline: 1 week for initial setup
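
If your POS export uses different column names, a short pandas script can reshape it into the required columns from step 3. The source column names in this sketch (sold_on, product, units, service, unit_price, menu_category) are placeholders for whatever your own export actually contains.

# Sketch: reshape a raw POS export into the required upload columns.
# The source column names below are hypothetical; map them to whatever
# your own POS/ERP export uses.
import pandas as pd

raw = pd.read_csv("pos_export.csv")

formatted = pd.DataFrame({
    "date": pd.to_datetime(raw["sold_on"]).dt.strftime("%Y-%m-%d"),
    "item_name": raw["product"].str.strip(),
    "quantity_sold": raw["units"].astype(int),
    "service_period": raw["service"].str.lower(),   # e.g. "lunch", "dinner"
    "price": raw["unit_price"].round(2),
    "category": raw["menu_category"],
})

# One row per item, date, and service period
formatted = (formatted
             .groupby(["date", "item_name", "service_period", "price", "category"],
                      as_index=False)["quantity_sold"].sum())

formatted.to_csv("forecast_upload.csv", index=False)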

Step 2: Historical Data Import

Data Preparation

Verify Data Quality:

Run these checks before importing:

Completeness Check:

  • No missing dates in the range
  • All menu items tracked consistently
  • Service periods clearly labeled

Accuracy Check:

  • Quantities match actual portions served
  • Date stamps are correct (watch for timezone issues)
  • No duplicate entries for same item/date

Consistency Check:

  • Same item names across all dates
  • Standardized category names
  • Consistent service period labels

Example Quality Check:

import pandas as pd

# Load your data and parse dates
df = pd.read_csv('sales_data.csv', parse_dates=['date'])

# Check for missing dates
date_range = pd.date_range(start=df['date'].min(), end=df['date'].max())
missing_dates = date_range.difference(df['date'])
print(f"Missing dates: {list(missing_dates.date)}")

# Check for inconsistent item names (same item spelled differently)
normalized = df['item_name'].str.strip().str.lower()
print(f"Total unique items: {df['item_name'].nunique()}")
print(f"Items differing only by case or whitespace: "
      f"{df['item_name'].nunique() - normalized.nunique()}")

# Check for duplicate entries
duplicates = df[df.duplicated(['date', 'item_name', 'service_period'])]
print(f"Duplicate entries: {len(duplicates)}")

Import Process

  1. Submit Historical Data

    Via your chosen method:

    • Necta: Automatic import from existing data
    • API: Bulk import endpoint
    • Manual: Upload portal
  2. Data Validation

    Eaternity team reviews:

    • Data format compliance
    • Quality metrics
    • Completeness assessment
    • Any anomalies or issues
  3. Receive Validation Report

    Within 2 business days:

    • Data quality score
    • Issues found and recommendations
    • Approval to proceed or requests for corrections
  4. Corrections (if needed)

    Address any issues:

    • Reformat data according to feedback
    • Fill in missing information
    • Resolve inconsistencies
    • Resubmit for validation

Expected Data Volume

Minimum for Basic Training:

  • 30 days of historical data
  • All menu items tracked
  • At least 50 covers/day average

Recommended for Optimal Training:

  • 90+ days of historical data
  • Seasonal variation represented
  • Special events and holidays included

Ideal for Advanced Accuracy:

  • 180+ days (6 months)
  • Full seasonal cycle
  • Weather data available
  • Event calendar included

Step 3: Model Training

Training Process

Once historical data is imported, neural network training begins automatically.

Phase 1: Initial Pattern Recognition (Days 1-3)

The model learns:

  • Basic daily patterns
  • Item popularity trends
  • Service period differences
  • Day-of-week variations

Phase 2: Advanced Feature Learning (Days 4-7)

The model identifies:

  • Weekly and monthly cycles
  • Seasonal trends (if sufficient data)
  • Weather correlations
  • Event impact patterns

Phase 3: Optimization (Days 8-14)

The model refines:

  • Prediction accuracy
  • Confidence interval calibration
  • Outlier handling
  • Menu change adaptation

Training Monitoring

Progress Dashboard:

Access training status via:

  • Email updates (daily summary)
  • Dashboard interface (real-time)
  • Slack notifications (optional)

Key Metrics Displayed:

  • Training progress percentage
  • Current accuracy on validation set
  • Expected completion date
  • Any issues or warnings

Example Training Report:

Training Progress: 65% complete
Current MAPE: 18.2% (target: less than 15%)
Items trained: 42/65
Expected completion: 2024-01-25
Status: On track
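
MAPE (mean absolute percentage error) is the accuracy figure quoted in the report. If you want to sanity-check the concept against your own numbers, it can be computed in a few lines; the sample values below are illustrative only, not real training output.

# Sketch: mean absolute percentage error (MAPE), the metric quoted above.
# Sample numbers are illustrative, not real training output.
def mape(actual, predicted):
    pairs = [(a, p) for a, p in zip(actual, predicted) if a != 0]
    return 100 * sum(abs(a - p) / a for a, p in pairs) / len(pairs)

actual_portions = [48, 30, 28]       # what was actually served
predicted_portions = [52, 31, 25]    # what the model predicted
print(f"MAPE: {mape(actual_portions, predicted_portions):.1f}%")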

What Happens During Training

You don't need to do anything during this phase, but it helps to understand what's happening:

  1. Data Preprocessing

    • Normalization of quantities
    • Feature extraction (day of week, seasonality, trends; see the sketch after this list)
    • Weather data integration
    • Event calendar alignment
  2. Model Architecture Setup

    • Transformer layers configured
    • Attention mechanisms initialized
    • Temporal encoding established
    • Multi-layer processing prepared
  3. Training Iterations

    • Model learns from historical patterns
    • Validation against held-out data
    • Hyperparameter optimization
    • Regularization to prevent overfitting
  4. Accuracy Validation

    • Comparison to human forecaster baseline
    • Confidence interval calibration
    • Error analysis and pattern identification
    • Final model selection
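
For context only, the calendar-style features mentioned in step 1 can be pictured as simple derived columns. The pandas sketch below is an illustration of the idea, not the actual preprocessing pipeline.

# Rough illustration of calendar-style features derived during preprocessing.
# This is not the actual pipeline, only a picture of the idea.
import pandas as pd

df = pd.read_csv("sales_data.csv", parse_dates=["date"])
df["day_of_week"] = df["date"].dt.dayofweek        # 0 = Monday
df["is_weekend"] = df["day_of_week"] >= 5
df["month"] = df["date"].dt.month                  # coarse seasonality signal
df["rolling_7d_mean"] = (df.sort_values("date")
                           .groupby("item_name")["quantity_sold"]
                           .transform(lambda s: s.rolling(7, min_periods=1).mean()))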

Step 4: Start Forecasting

First Predictions

Timeline: 2-4 weeks after setup begins

Notification:

  • Email alert when first predictions are ready
  • Dashboard shows "Active" status
  • Predictions available via API or interface

Initial Prediction Set:

  • Next 7 days forecasted
  • All active menu items included
  • Confidence intervals for each prediction
  • Historical accuracy metrics displayed

Accessing Predictions

Via Necta Interface (Necta customers):

  1. Log into Necta planning module
  2. Navigate to "Demand Forecast" section
  3. View daily predictions by item
  4. Export to planning worksheets

Via API (Custom integrations):

# Get predictions for specific date
curl -X GET "https://api.eaternity.org/v1/forecast/predictions?date=2024-01-20" \
-H "Authorization: Bearer your_token"

# Response
{
  "date": "2024-01-20",
  "day_of_week": "Saturday",
  "predictions": [
    {
      "item_name": "Pasta Carbonara",
      "predicted_quantity": 52,
      "confidence_interval": {
        "lower": 45,
        "upper": 59
      },
      "accuracy_last_30_days": 92.3
    }
  ]
}
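
The same request can also be made from a script. The Python sketch below assumes the endpoint, parameters, and response fields shown in the curl example above; confirm them against the current API documentation.

# Sketch: fetch one day of predictions and print a planning summary.
# Endpoint path, parameters, and response fields mirror the curl example above.
import requests

resp = requests.get(
    "https://api.eaternity.org/v1/forecast/predictions",
    params={"date": "2024-01-20"},
    headers={"Authorization": "Bearer your_token"},
)
resp.raise_for_status()

for p in resp.json()["predictions"]:
    ci = p["confidence_interval"]
    print(f'{p["item_name"]}: {p["predicted_quantity"]} portions '
          f'(range {ci["lower"]}-{ci["upper"]})')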

Via Dashboard (Manual access):

  1. Log into Forecast dashboard
  2. Select date range
  3. View predictions table
  4. Download CSV export

Understanding Your First Predictions

Prediction Components:

Each forecast includes:

  1. Predicted Quantity: Most likely number of portions
  2. Confidence Interval: Range of expected demand (lower to upper bound)
  3. Accuracy Metric: How reliable predictions have been recently
  4. Factors: Key drivers (weather, day of week, events)

Example Prediction:

Item: Pasta Carbonara
Date: Saturday, January 20, 2024
Predicted Quantity: 52 portions

Confidence Interval: 45-59 portions
- Lower bound (10th percentile): 45
- Upper bound (90th percentile): 59
- Confidence level: 80%

Historical Accuracy: 92.3% (last 30 days)

Key Factors:
- Weekend (Saturday): +20% vs weekday average
- Temperature: 8°C (normal winter demand)
- No special events detected

Learn more about confidence intervals →
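
How you turn an interval into a prep quantity is a kitchen policy decision rather than something the system dictates. One possible heuristic, sketched below, is to prep toward the upper bound for items that must not run out and closer to the point forecast for items that keep well.

# Sketch: one possible way to turn a confidence interval into a prep target.
# The weighting is a kitchen policy choice, not part of the Forecast API.
def prep_quantity(predicted, upper, must_not_run_out=False, buffer_weight=0.5):
    """Blend the point forecast with the upper bound.

    buffer_weight=0 preps exactly the forecast; 1.0 preps the upper bound.
    """
    if must_not_run_out:
        return upper
    return round(predicted + buffer_weight * (upper - predicted))

# Using the Pasta Carbonara example above: forecast 52, upper bound 59
print(prep_quantity(52, 59))                          # 56 portions
print(prep_quantity(52, 59, must_not_run_out=True))   # 59 portions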

Daily Workflow Integration

Recommended Process:

  1. Morning Review (5-10 minutes)

    • Check today's final sales vs yesterday's prediction
    • Review tomorrow's forecast
    • Note any surprising variances
  2. Planning (15-20 minutes)

    • Use predictions for ingredient ordering
    • Adjust prep quantities based on forecasts
    • Consider confidence intervals for buffer planning
  3. Feedback (optional, 2-3 minutes)

    • Note any missed factors (unexpected events, weather changes)
    • Report prediction errors >30% to help improve model
    • Submit feedback via dashboard or email

See Implementation Guide for detailed workflow →

Validation and Testing Phase

Week 1: Observation Mode

Goal: Understand how predictions compare to your current forecasting

Activities:

  • Review daily predictions but don't change current process yet
  • Compare Forecast predictions to your existing forecasts
  • Note any patterns or surprises
  • Track prediction accuracy

Metrics to Track:

| Item          | Actual | Your Forecast | AI Forecast | Your Error | AI Error |
|---------------|--------|---------------|-------------|------------|----------|
| Pasta Carb. | 48 | 55 | 52 | +14.6% | +8.3% |
| Caesar Salad | 30 | 28 | 31 | -6.7% | +3.3% |
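
The error columns in the table are simply (forecast - actual) / actual. If you track the comparison in a CSV, a few lines of pandas can fill them in; the file and column names below are assumptions for a sheet you maintain yourself.

# Sketch: compute the error columns from the Week 1 comparison table.
# Expects a CSV with columns: item, actual, your_forecast, ai_forecast.
import pandas as pd

df = pd.read_csv("week1_comparison.csv")
df["your_error_pct"] = 100 * (df["your_forecast"] - df["actual"]) / df["actual"]
df["ai_error_pct"] = 100 * (df["ai_forecast"] - df["actual"]) / df["actual"]
print(df.round(1))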

Week 2: Hybrid Approach

Goal: Start incorporating predictions into planning

Activities:

  • Use predictions for 25-50% of menu items
  • Keep manual forecasting for high-stakes items initially
  • Compare results between manual and AI forecasts
  • Build confidence in prediction accuracy

Team Training:

  • Review confidence intervals with kitchen staff
  • Discuss how to handle upper/lower bounds
  • Practice adjusting for known factors not in data

Week 3-4: Full Deployment

Goal: Use predictions for all menu items

Activities:

  • Rely on predictions for daily planning
  • Use confidence intervals for buffer decisions
  • Track actual waste reduction
  • Calculate cost savings

Success Indicators:

  • Reduced overproduction
  • Maintained service quality (no stock-outs)
  • Time saved on manual forecasting
  • Team confidence in using system

Troubleshooting Setup Issues

| Issue                           | Quick Fix                                             |
|---------------------------------|-------------------------------------------------------|
| Data import fails               | Check UTF-8 encoding, YYYY-MM-DD dates, no blank rows |
| API authentication fails        | Verify API key has no extra spaces, use HTTPS         |
| Predictions seem inaccurate     | Ensure 30+ days of data, allow 2 weeks training       |
| Necta integration not appearing | Clear cache, verify module activation with Necta      |

For detailed solutions, see Integration Troubleshooting.

Getting Help

Email: forecast@eaternity.org

| Issue Type              | Response Time     |
|-------------------------|-------------------|
| Critical (system down)  | 4 hours           |
| Integration problems    | 24 hours          |
| Data/feature questions  | 48 hours - 1 week |

Checklist: Setup Complete

Integration

  • POS/ERP connection established
  • Authentication configured and tested
  • Data flow verified

Historical Data

  • Minimum 30 days imported
  • Data validation passed
  • Quality score >80%

Training

  • Neural network training completed (100%)
  • Validation accuracy meets targets
  • All menu items trained

Predictions

  • First predictions received
  • Team can access via interface or API
  • Confidence intervals understood

Team Readiness

  • Contact person trained
  • Kitchen staff briefed on using predictions
  • Workflow integration planned
  • Feedback process established

Next Steps

Once setup is complete:

  1. Begin Daily Use

    • Incorporate predictions into planning workflow
    • Track accuracy and food waste reduction
    • Report any issues or unexpected results
  2. Provide Feedback

    • Schedule first monthly check-in call
    • Share early observations and questions
    • Suggest improvements or feature requests
  3. Optimize Usage

  4. Monitor Performance

    • Track cost savings from waste reduction
    • Measure time saved vs manual forecasting
    • Document success stories for case study

See Also