The Symulator Wad Wzroku (SWW) is a research initiative developed at Politechnika Gdańska (Gdansk University of Technology). The project aims to create a simulation environment for visualizing various eye defects and impairments.
The system is composed of two primary applications:
- SWW-UnitySimulator: The main simulation engine built in Unity, responsible for rendering the virtual environment and applying visual impairment effects.
- SWW-ExternalController: A WPF (Windows Presentation Foundation) application that acts as a remote control for the simulator, allowing researchers to toggle and adjust impairments in real-time.
Politechnika Gdańska (Gdansk University of Technology)
- Faculty of Electronics, Telecommunications and Informatics
- Department of Intelligent Interactive Systems
- Immersive 3D Visualization Lab (LZWP)
The primary goal of the research is to evaluate the usability and realism of the VR application in simulating various visual impairments. The study aims to:
- Assess Realism: Verify if the simulation faithfully reproduces the perception of individuals with specific visual defects.
- Evaluate User Experience: Determine the degree of perceptual limitation felt by users without visual impairments.
- Analyze Task Difficulty: Measure how different simulated conditions affect the subjective difficulty of performing daily tasks.
- Validate Design Utility: Confirm if the simulator can serve as an effective tool for architectural design oriented towards accessibility.
The corresponding research questions are:
- Does the VR application faithfully reproduce the perception of the world by people with selected visual impairments?
- To what extent do users (without visual impairments) feel perceptual limitations when using the simulator?
- Does the simulation of different eye conditions affect the subjective assessment of the difficulty of performing daily tasks in a virtual environment?
- Can the immersive spatial visualization simulator be effectively used as a tool to support the design of architectural space adapted to people with visual impairments?
Based on analysis of the source code and project documentation, the simulator currently implements the following visual impairments:
These effects are applied to the camera's view to simulate global vision changes.
- Refractive & Focus Errors: `Farsighted`, `Shortsighted`, `DepthBlur`, `BlurVision`
- Color Vision Deficiencies: `ColorBlind_Deuteranopia`, `ColorBlind_Protanopia`, `ColorBlind_Tritanopia`
- Environmental & Light Sensitivity: `FoggyVision`, `NightVision`, `Desaturation`, `Bloom`, `Halo`
- Distortions & Disorientations: `Distortion`, `LineDistortion`, `DoubleVision`, `Dizziness`, `CameraShake`
These effects are rendered using a sphere around the user to simulate localized field loss or obstructions.
- Field Loss: `RadialVignette` (Tunnel Vision), `DarkSpot`
- Obstructions: `Floaters`
- Legacy Effects: `(Legacy)BlindSpots`, `(Legacy)Floaters`
The system operates on a Client-Server architecture designed for a CAVE (Cave Automatic Virtual Environment) setup.
- Server (Controller): The WPF application acts as the command center. It hosts a TCP server and broadcasts its presence via UDP.
- Client (Simulator): The Unity application runs on the CAVE cluster. It auto-discovers the server, connects via TCP, and executes commands.
- Discovery: UDP broadcast on port 7777. Message: `SWW_ExternalController:<IP>:<PORT>`
- Command Channel: TCP connection on port 41002.
- Data Format: custom string-based protocol (a sketch of building and sending a command follows this list):
  - Impairments: `VisualImpairments:Name,Strength;Name2,Strength2\n`
  - Transform: `SphereRendererInitialTransform:X;Y;Z;...`
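As an illustration of the command channel, a minimal controller-side sketch: the listener port and message format come from the protocol above, while the class name, the `Demo` flow, and the ASCII encoding are assumptions.

```csharp
using System.Globalization;
using System.Net;
using System.Net.Sockets;
using System.Text;

static class CommandChannelSketch
{
    // Builds e.g. "VisualImpairments:Farsighted,0.5;Floaters,1\n"
    public static string BuildImpairmentsCommand(params (string name, float strength)[] impairments)
    {
        var parts = new string[impairments.Length];
        for (int i = 0; i < impairments.Length; i++)
            parts[i] = impairments[i].name + "," +
                       impairments[i].strength.ToString(CultureInfo.InvariantCulture);
        return "VisualImpairments:" + string.Join(";", parts) + "\n";
    }

    public static void Demo()
    {
        // The controller hosts the TCP server; the Unity simulator connects to it.
        var listener = new TcpListener(IPAddress.Any, 41002);
        listener.Start();
        using (TcpClient simulator = listener.AcceptTcpClient())
        using (NetworkStream stream = simulator.GetStream())
        {
            string command = BuildImpairmentsCommand(("Farsighted", 0.5f), ("Floaters", 1.0f));
            byte[] payload = Encoding.ASCII.GetBytes(command);   // encoding is an assumption
            stream.Write(payload, 0, payload.Length);
        }
        listener.Stop();
    }
}
```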
Built on Unity 2018.1.9f2, this component handles the immersive visualization.
The core of the simulation uses a Dual-Rendering Pipeline to support different types of vision defects:
- ScriptableObject Architecture: Each impairment is defined as a `VisualImpairmentSO` asset. This allows for modular configuration of shaders, default strengths, and renderer types without recompiling code.
- Camera Renderer (`VisualImpairmentsRenderer_Camera`):
  - Handles Post-Processing Effects (e.g., Blur, Color Blindness, Distortions).
  - Uses `OnRenderImage` to intercept the render pipeline.
  - Dynamically stacks multiple shader passes using `Graphics.Blit`. If multiple effects are active, it manages temporary `RenderTexture` buffers to chain them efficiently (see the sketch after this list).
- Sphere Renderer (`VisualImpairmentsRenderer_Sphere`):
  - Handles Localized Effects (e.g., Tunnel Vision, Floaters, Scotomas).
  - Renders a physical mesh sphere surrounding the user.
  - Manipulates the `MeshRenderer` material array at runtime, layering impairment materials over the base material.
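A minimal sketch of the camera renderer's chaining pattern, assuming a simplified component; the actual `VisualImpairmentsRenderer_Camera` is more involved, and the class and field names below are illustrative only.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class ImpairmentBlitChain : MonoBehaviour
{
    // Materials for the currently active post-processing impairments, in order.
    public List<Material> activeEffects = new List<Material>();

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (activeEffects.Count == 0)
        {
            Graphics.Blit(source, destination);   // no effects: straight copy
            return;
        }

        RenderTexture current = source;
        RenderTexture previousTemp = null;

        // Every pass but the last renders into a temporary buffer that feeds the next pass.
        for (int i = 0; i < activeEffects.Count - 1; i++)
        {
            RenderTexture next = RenderTexture.GetTemporary(source.width, source.height);
            Graphics.Blit(current, next, activeEffects[i]);

            if (previousTemp != null) RenderTexture.ReleaseTemporary(previousTemp);
            previousTemp = next;
            current = next;
        }

        // Final pass writes directly to the camera's destination.
        Graphics.Blit(current, destination, activeEffects[activeEffects.Count - 1]);
        if (previousTemp != null) RenderTexture.ReleaseTemporary(previousTemp);
    }
}
```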
A structured system for conducting ADL (Activities of Daily Living) experiments:
- TaskManagerBehaviour: A Singleton manager that orchestrates the experiment flow. It synchronizes the state of the "Clipboard" (instructions) across the cluster using RPCs.
- Task Logic:
  - Base `Task` class defines the contract (`isDone`, `getDescription`); see the sketch after this list.
  - Complex tasks like `WashHandsTask` utilize Trigger Zones and Flystick Tracking (checking for specific controller handles within a collider) to validate user actions over time.
- Cluster Synchronization:
  - LZWPlib: A custom library used for synchronizing the CAVE cluster (Master/Slave nodes).
  - NetworkView: Uses legacy RPCs (`RPCMode.All`, `RPCMode.OthersBuffered`) to ensure all walls of the CAVE display the same task state and UI text.
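A rough sketch of the task contract described above; `isDone` and `getDescription` come from the description, while the `WashHandsTask` details (timer, threshold, method names) are illustrative assumptions.

```csharp
// Base contract for an ADL task, as described above.
public abstract class Task
{
    public abstract bool isDone();            // has the user completed the activity?
    public abstract string getDescription();  // instruction text shown on the Clipboard
}

// Illustrative complex task: considered done after the tracked hand has spent
// enough time inside the sink's trigger zone.
public class WashHandsTask : Task
{
    private float secondsInZone;
    private const float RequiredSeconds = 5f;   // assumed threshold

    // Called each frame by the trigger zone while the Flystick handle is inside the collider.
    public void OnHandInZone(float deltaTime) => secondsInZone += deltaTime;

    public override bool isDone() => secondsInZone >= RequiredSeconds;

    public override string getDescription() => "Wash your hands in the bathroom sink.";
}
```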
A WPF (.NET) application built with the MVVM (Model-View-ViewModel) pattern for robust state management.
- TCP Server: Uses `TcpListener` to accept connections from the Unity Simulator.
- UDP Broadcaster: Runs a background thread to announce the controller's IP address to the local network, enabling "Zero-Config" connections.
- Thread Safety: Marshals network events back to the UI thread for safe updates.
- State Management: Maintains an `ObservableCollection<VisualImpairment>` representing the current simulation state.
- Real-time Updates: Subscribes to `PropertyChanged` events on every impairment model. When a slider is moved, it immediately serializes the new state and transmits it to the simulator (see the sketch after this list).
- JSON Persistence: Configurations are saved as JSON files in a `./presets/` directory.
- Manager: `PresetsManager` handles CRUD operations, ensuring researchers can switch between complex impairment profiles (e.g., "Glaucoma Stage 3") instantly.
- Debug Tool: A built-in console (`DevConsoleManager`) allows for direct command injection and debugging.
- Features: Supports command suggestions and history, useful for testing the network protocol directly.
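A condensed sketch of the real-time update path, assuming simplified `VisualImpairment` and view-model classes; the serialization string follows the Communication Protocol section, and everything else is illustrative rather than the actual MVVM code.

```csharp
using System;
using System.Collections.Generic;
using System.Collections.ObjectModel;
using System.ComponentModel;

// Simplified model; the real class likely carries more metadata.
public class VisualImpairment : INotifyPropertyChanged
{
    private float _strength;
    public string Name { get; set; }

    public float Strength
    {
        get => _strength;
        set { _strength = value; PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(Strength))); }
    }

    public event PropertyChangedEventHandler PropertyChanged;
}

public class SimulationStateViewModel
{
    private readonly Action<string> _sendToSimulator;

    public ObservableCollection<VisualImpairment> Impairments { get; } = new ObservableCollection<VisualImpairment>();

    public SimulationStateViewModel(Action<string> sendToSimulator)
    {
        _sendToSimulator = sendToSimulator;

        // Whenever an impairment is added, watch its PropertyChanged so that a
        // slider change immediately re-serializes and transmits the full state.
        Impairments.CollectionChanged += (sender, e) =>
        {
            if (e.NewItems == null) return;
            foreach (VisualImpairment item in e.NewItems)
                item.PropertyChanged += (o, args) => _sendToSimulator(Serialize());
        };
    }

    // Uses the string format from the Communication Protocol section.
    private string Serialize()
    {
        var parts = new List<string>();
        foreach (var i in Impairments)
            parts.Add($"{i.Name},{i.Strength.ToString(System.Globalization.CultureInfo.InvariantCulture)}");
        return "VisualImpairments:" + string.Join(";", parts) + "\n";
    }
}
```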
The External Controller allows for real-time manipulation of the simulation. It uses a TCP connection to send commands to the Unity Simulator.
- Discovery: The controller broadcasts its presence on UDP port 7777, allowing the simulator to automatically find and connect to it.
- Parameter Adjustment: Each impairment has adjustable parameters (e.g., `EffectStrength`) that can be modified on the fly.
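For reference, the discovery announcement could be produced roughly as follows; the port and message format follow the Communication Protocol section, while the class name and the one-second interval are assumptions.

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;

// Background broadcaster announcing the controller's address on the LAN.
public class DiscoveryBroadcaster
{
    private readonly UdpClient _udp = new UdpClient { EnableBroadcast = true };
    private readonly IPEndPoint _broadcastEndpoint = new IPEndPoint(IPAddress.Broadcast, 7777);

    public void Start(string localIp, int tcpPort)
    {
        byte[] message = Encoding.ASCII.GetBytes($"SWW_ExternalController:{localIp}:{tcpPort}");

        var thread = new Thread(() =>
        {
            while (true)
            {
                _udp.Send(message, message.Length, _broadcastEndpoint);  // announce presence
                Thread.Sleep(1000);                                      // assumed 1 s interval
            }
        })
        { IsBackground = true };   // stops with the application
        thread.Start();
    }
}
```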
The system supports saving and loading of Visual Impairment Presets. This allows researchers to define specific combinations of impairments (e.g., "Severe Myopia with Color Blindness") and quickly apply them during experiments.
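A minimal sketch of preset persistence under an assumed preset shape; the actual `PresetsManager` classes and JSON layout may differ.

```csharp
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

[DataContract]
public class ImpairmentSetting
{
    [DataMember] public string Name;
    [DataMember] public float Strength;
}

[DataContract]
public class ImpairmentPreset
{
    [DataMember] public string Name;                          // e.g. "Severe Myopia with Color Blindness"
    [DataMember] public List<ImpairmentSetting> Impairments;
}

public static class PresetStore
{
    // Writes ./presets/<Name>.json
    public static void Save(ImpairmentPreset preset, string directory = "./presets")
    {
        Directory.CreateDirectory(directory);
        var serializer = new DataContractJsonSerializer(typeof(ImpairmentPreset));
        using (var stream = File.Create(Path.Combine(directory, preset.Name + ".json")))
            serializer.WriteObject(stream, preset);
    }

    public static ImpairmentPreset Load(string path)
    {
        var serializer = new DataContractJsonSerializer(typeof(ImpairmentPreset));
        using (var stream = File.OpenRead(path))
            return (ImpairmentPreset)serializer.ReadObject(stream);
    }
}
```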
The Unity Simulator includes a TaskManager to handle experimental tasks based on Activities of Daily Living (ADL).
- Task Workflow: Tasks are activated sequentially.
- Implemented Tasks:
- Meal Preparation: Sandwich assembly in the Kitchen.
- Personal Hygiene: Hand washing in the Bathroom.
- Consumer Electronics: Finding a remote and turning on the TV in the Bedroom.
- Extensibility: New tasks can be added by extending the `TaskSetup` and `Task` classes.
The simulation takes place in a virtual single-family home environment consisting of four main rooms arranged in an inverted "T" layout, plus an introductory scene:
- Hall: Central hub connecting all rooms.
- Kitchen: Fully equipped with appliances and interactive objects.
- Bathroom: Standard fixtures for hygiene tasks.
- Bedroom: Cozy environment with a bed and TV.
- Intro: A lobby scene for tutorials and initial setup.
The system is engineered for a specific high-end visualization environment.
- CAVE System: A multi-wall projection system driven by a cluster of rendering nodes.
- Tracking: 6-DOF tracking system for user head position and interaction devices ("Flystick").
- Network: Low-latency LAN connecting the rendering cluster and the control station.
- Runtime: Unity 2018.1.9f2 (specific version required for `LZWPlib` compatibility).
- OS: Windows 10/11 (required for the WPF External Controller).
- Dependencies:
  - `LZWPlib`: Proprietary library for cluster synchronization and projection mapping.
  - .NET Framework: Required for the External Controller.
Project documentation is maintained internally by the research team.
- Systematic Literature Review: A comprehensive review of existing VR visual impairment simulations has been conducted to inform the design.
- Research Article: A paper detailing the system architecture, implementation challenges, and validation results is currently in preparation.
Supervisor: dr inż. Jacek Lebiedź
Research Team:
- Adam Cherek
- Mikołaj Bisewski
- Karolina Zaborowska
- Barbara Badziąg
- Marcin Chętnik
- Bogumiła Merc
- Tomasz Pietrowski
Property of Politechnika Gdańska (Gdansk University of Technology) and the research team.
This is a closed-source research project. The code and assets are proprietary. Access to the source code, binaries, or use of the simulator requires explicit permission and contact with the university representatives.
- Politechnika Gdańska for providing research facilities
- LZWP Team for guidance and a great atmosphere