Advanced Driver Assistance System Testing using OptiX
NVIDIA GTC 2012, San Jose


Advanced Driver Assistance System Testing using OptiX

Erwin Roth, Robotics and Embedded Systems, Technische Universität München
Tugkan Calapoglu, Holger Helmich, Vires Simulationstechnologie GmbH, Bad Aibling
Kilian v. Neumann-Cosel, Automotive Safety Technologies GmbH, Ingolstadt
Marc-Oliver Fischer, Department for Vehicle Safety Sensor Algorithms, AUDI AG, Ingolstadt
Andreas Kern, Audi Electronics Venture GmbH, Ingolstadt
Andreas Heider, Alois Knoll, Robotics and Embedded Systems, Technische Universität München


Agenda
► Motivation, requirements and goals
► Integration of OptiX into the Vires Virtual Test Drive simulation software
► Advanced material and emitter data descriptions
► Example sensor model implementations
► Model validation and verification process
► Summary and outlook


Motivation
Growing challenges for testing new Advanced Driver Assistance Systems (ADAS):
► Increasing number of comfort-, energy-management- and safety-related functions
► Growing dependency of ADAS functions on multiple perception sensors
► Difficulty of recording reproducible sensor data for real-world scenarios
Forecast: 300% increase in shipped ADAS units [millions] from 2012 to 2017 (Source: iSuppli)


Main Objectives
• Support ADAS testing with computer simulations for realistic multi-sensor data computation
• Validated sensor models as parts of an integrated vehicle and environment simulation system
• Enable closed-loop simulations in hardware- and software-in-the-loop test beds
• Reproducibility of test scenarios for a wide range of environment and traffic conditions
• Early evaluation of new sensor concepts and ADAS functions
• Increased test space coverage by combining real and virtual test drives


Multi-Sensor Simulation Environment Objectives
• Simultaneous execution of multiple perception sensor emulators with realistic distortion effects
• Share a common simulation infrastructure for sensor data consistency:
  • scenario description
  • object, material and emitter (light) data
  • communication
  • configuration


Sensor Emulator Requirements – Architecture
• Support the same data formats and interfaces as the real sensor
  -> simplified integration into the existing ADAS development and testing process


Sensor Emulator Requirements – Simulation Variants
• Support various simulation types in the existing ADAS test toolchain
• Toolchain for the integrated development and testing of ADAS functions, combined with ADTF® (Automotive Data and Time Triggered Framework) and MATLAB® & Simulink®
• Simulation variants on the virtual-to-real axis and their timing requirements [Bock, EF-56]:
  • Software-in-the-loop: slower/faster than real-time
  • Driver-in-the-loop: soft real-time
  • Vehicle-in-the-loop: hard real-time
  • Hardware-in-the-loop: hard real-time


Sensor Emulator Requirements – Level of Realism
• Extensible architecture regarding refined models for a higher level of realism
• Configurable approximation accuracy and distortion levels with a single, consistent model
• Sensor emulation data accuracy ranges from ideal to close to reality:
  • algorithm validation (SIL, faster than real-time): near-ideal data
  • ADAS system testing (HIL, hard real-time): 'functional realism'
  • operational-envelope testing (SIL, non-real-time): 'close to reality'


Sensor Emulator Requirements – Physics
• Particle-, ray- and wave-based physical measurement methods shall be approximated
• Physics-oriented modeling of
  • the sensor data acquisition process
  • related systematic and stochastic distortion effects
  • material, surface and emitter properties
• Signal chain: emitter, signal propagation, receiver [Source: Keller, Kolb]


Advanced Material and Emitter Description
Multi-modal sensor simulation requires extended material and light source (emitter) descriptions:
• Each material is identified by a unique ID
• The descriptions also cover the non-visible light spectrum, e.g. 300–1000 nm
• They hold metadata for material/emitter classification, lookup, ...
• Physical properties are stored as scalars and textures
• Existing measurement techniques and material standards are supported, incl. accuracy information
• Material data records must be extensible
Example material record (see the sketch below):
• Unique material ID, parent material ID = Alloy
• Metadata: description = "Alloy 85", class: Metal, surface subclass: Coated, measurement standard and accuracy info
• Physical parameters [SI units]: electrical conductivity = 36.59 * 10^6 S/m, surface roughness = 120 um
• Diffuse reflection spectrum; measured BRDF/BSDF/BTF for the visible, infrared, ... spectrum
• OpenGL fallback colors; software-specific shaders / RT programs (extended data)
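
A minimal host-side sketch of how such an extensible material record could be laid out, mirroring the fields listed above. The type and member names (MaterialRecord, extendedScalars, ...) are illustrative assumptions, not the actual VTD/OptiX plug-in data structures.

    // Hypothetical layout of an extensible material record (illustrative only).
    #include <cstdint>
    #include <map>
    #include <string>
    #include <vector>

    struct SpectralSamples {
        float wavelengthMinNm = 300.0f;   // non-visible spectrum is covered as well
        float wavelengthMaxNm = 1000.0f;
        std::vector<float> values;        // e.g. diffuse reflectance per wavelength bin
    };

    struct MaterialRecord {
        uint32_t id = 0;                       // unique material ID
        uint32_t parentId = 0;                 // e.g. "Alloy" for "Alloy 85"
        // Metadata for classification and lookup
        std::string description;              // "Alloy 85"
        std::string materialClass;             // "Metal"
        std::string surfaceSubclass;           // "Coated"
        std::string measurementStandard;       // incl. accuracy information
        // Physical parameters in SI units
        double electricalConductivity = 0.0;   // [S/m]
        double surfaceRoughness = 0.0;         // [m]
        // Spectral and measured data
        SpectralSamples diffuseReflection;
        std::string brdfFile;                  // measured BRDF/BSDF/BTF data set
        // Renderer-specific fallbacks
        float openglFallbackColor[4] = {0.5f, 0.5f, 0.5f, 1.0f};
        std::string optixClosestHitPtx;        // software-specific RT program
        // Extension point: arbitrary named scalar properties
        std::map<std::string, double> extendedScalars;
    };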


Integration of OptiX and VTD


Open Interfaces
• Architecture: 3rd-party components (vehicle dynamics, driver model, systems dynamics, sensors) plug into Virtual Test Drive (mockup, Scenario Editor, Road Designer ROD, traffic, configuration, sound, 3D renderer v-IG, task and data control, operator station, systems, module manager)
• VTD core = simulation environment
  • 3D rendering (image generator)
  • traffic and scenario
  • sound
  • mockup interfaces
  • record/playback
  • event and data handling
  • content creation
  • management of custom modules
• VTD dev = development environment
  • interfaces for
    • run-time data (Run-time Data Bus, RDB)
    • event/control data (Simulation Control Protocol, SCP)
    • sensor development (using the image generator v-IG)
  • module development via library and C++ API


VTD – Content Creation Toolchain
• Content creation: IOS (project edit mode), Scenario Editor and ROD (Road Designer) produce the project, the scenario description, road logic and geometry, and the visual database
• Runtime: IOS, Traffic, Vehicle Dynamics, v-IG (3D renderer) and Task Control consume this content data and exchange real-time data


v-IG
• OpenSceneGraph (and OpenGL) based 3D renderer
• Part of VTD, but also available as a standalone renderer
• Provides an API
• Used in driving, train and flight simulators
• Used in sensor simulation applications
• Example renderings: standard day scene (OpenGL), sensor image for a hardware-in-the-loop simulator (OpenGL), HDR night scene with wet road (OpenGL)


OptiX plug-in
• Conversion of the OSG scene to an OptiX scene (geometry, materials)
• Synchronization of the OSG and OptiX scenes (animations, LOD, lights)
• Post processing
• Real-time data transfer
• C++ API provided for customization:
  • new camera models in CUDA/C++
  • different buffer formats, multiple output buffers
  • custom light sources
  • building post-processing pipelines
  • adding/editing materials


Creating the OptiX node graph
• v-IG loads the scene (terrain, 3D models, materials) and creates an OSG scene graph
• The OptiX plug-in translates the OSG scene graph into an OptiX node graph (see the sketch below)
• OptiX-specific optimizations are applied during translation
• Some objects (e.g. vehicles) are loaded and deleted at run-time
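
A sketch of how such a translation step could look with the public OSG and OptiX C++ APIs: a node visitor walks the OSG graph and creates OptiX geometry instances. Class, buffer and PTX file names are assumptions, not the actual plug-in implementation.

    // Sketch of an OSG-to-OptiX translation visitor (hypothetical names).
    #include <cstring>
    #include <vector>
    #include <osg/Geode>
    #include <osg/Geometry>
    #include <osg/NodeVisitor>
    #include <optixu/optixpp_namespace.h>

    class OsgToOptixVisitor : public osg::NodeVisitor {
    public:
        OsgToOptixVisitor(optix::Context ctx, optix::GeometryGroup group,
                          optix::Material defaultMaterial)
            : osg::NodeVisitor(TRAVERSE_ALL_CHILDREN),
              m_ctx(ctx), m_group(group), m_material(defaultMaterial) {}

        void apply(osg::Geode& geode) override {
            for (unsigned int i = 0; i < geode.getNumDrawables(); ++i) {
                osg::Geometry* geom = geode.getDrawable(i)->asGeometry();
                if (!geom) continue;
                const osg::Vec3Array* verts =
                    dynamic_cast<const osg::Vec3Array*>(geom->getVertexArray());
                if (!verts || verts->empty()) continue;

                // Copy vertex data into an OptiX input buffer
                optix::Buffer vbuf = m_ctx->createBuffer(
                    RT_BUFFER_INPUT, RT_FORMAT_FLOAT3, verts->size());
                std::memcpy(vbuf->map(), verts->getDataPointer(),
                            verts->size() * 3 * sizeof(float));
                vbuf->unmap();

                // Create the OptiX geometry node; intersection and bounding-box
                // programs come from pre-compiled PTX (file/function names are placeholders)
                optix::Geometry g = m_ctx->createGeometry();
                g->setPrimitiveCount(static_cast<unsigned int>(verts->size() / 3));
                g->setIntersectionProgram(
                    m_ctx->createProgramFromPTXFile("triangle_mesh.ptx", "intersect"));
                g->setBoundingBoxProgram(
                    m_ctx->createProgramFromPTXFile("triangle_mesh.ptx", "bounds"));
                g["vertex_buffer"]->set(vbuf);

                // The real plug-in looks up the material via the material buffer;
                // here a single default material is attached for simplicity
                optix::GeometryInstance gi = m_ctx->createGeometryInstance(
                    g, &m_material, &m_material + 1);
                m_group->addChild(gi);
            }
            traverse(geode);  // keep walking the OSG graph
        }

    private:
        optix::Context       m_ctx;
        optix::GeometryGroup m_group;
        optix::Material      m_material;
    };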


Material management
• XML material definitions, grouped according to wavelength and/or sensor type
• XML assignment lists associate materials with scene objects; objects are identified by textures or IDs and are likewise grouped according to wavelength and/or sensor type (e.g. IR, radar)
• v-IG assigns the materials to OSG scene graph nodes
• The OptiX plug-in creates OptiX materials and puts them into the material buffer


Material Management
• Material definitions in XML
• Common materials for rasterizer and ray tracer:
  • shader parameters become GLSL uniforms in the rasterizer and OptiX variables in the ray tracer (see the sketch below)
  • the rasterizer loads GLSL programs, OptiX loads PTX files
• New materials can be derived from existing ones: copy a sample material declaration and override material properties
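
A minimal sketch of the "one definition, two back-ends" idea: a single parameter parsed from the shared XML material definition is pushed once as a GLSL uniform and once as an OptiX variable. The parameter name "kd" and the helper functions are assumptions for illustration.

    #include <GL/glew.h>
    #include <optixu/optixpp_namespace.h>

    struct CommonMaterialParams {
        float kd[3];   // diffuse factor parsed from the shared XML material definition
    };

    // Rasterizer path: the parameter becomes a GLSL uniform
    void applyToGlsl(GLuint program, const CommonMaterialParams& p) {
        glUseProgram(program);
        GLint loc = glGetUniformLocation(program, "kd");
        glUniform3f(loc, p.kd[0], p.kd[1], p.kd[2]);
    }

    // Ray-tracer path: the same parameter becomes an OptiX variable on a
    // material whose closest-hit program was loaded from a PTX file
    void applyToOptix(optix::Material mat, const CommonMaterialParams& p) {
        mat["kd"]->setFloat(p.kd[0], p.kd[1], p.kd[2]);
    }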


Synchronization of scenes
• v-IG manipulates OSG scene graph nodes (e.g. DOFs) for animations
• LOD nodes are automatically updated by OSG
• The OptiX plug-in monitors such nodes and synchronizes their OptiX counterparts (see the sketch below)
• Data flow: the simulation sends animation data over the network (camera position, positions and orientations of objects, light positions, etc.); v-IG animation routines manipulate the OSG scene graph to animate the objects; the OptiX plug-in continuously synchronizes the OptiX node graph with it
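
A sketch, under stated assumptions, of what a per-frame synchronization of one animated node could look like: the OSG transform set by v-IG's animation routines is copied to its OptiX counterpart. The SyncedNode struct and function names are illustrative, not the actual plug-in interfaces.

    #include <osg/MatrixTransform>
    #include <optixu/optixpp_namespace.h>

    struct SyncedNode {
        osg::ref_ptr<osg::MatrixTransform> osgNode;  // manipulated by v-IG animation routines
        optix::Transform optixNode;                  // counterpart in the OptiX node graph
    };

    // Copy a 4x4 OSG matrix into a row-major float array
    static void toFloat16(const osg::Matrix& m, float out[16]) {
        for (int r = 0; r < 4; ++r)
            for (int c = 0; c < 4; ++c)
                out[4 * r + c] = static_cast<float>(m(r, c));  // OSG may store doubles
    }

    void synchronize(SyncedNode& n) {
        const osg::Matrix mat = n.osgNode->getMatrix();
        const osg::Matrix inv = osg::Matrix::inverse(mat);

        float m[16], mInv[16];
        toFloat16(mat, m);
        toFloat16(inv, mInv);

        // OSG uses a row-vector convention, OptiX a column-vector one,
        // so the matrices are handed over transposed.
        n.optixNode->setMatrix(/*transpose=*/true, m, mInv);
    }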


Lights and emitters
• v-IG creates and manages a list of lights
  • some lights are generated automatically from information stored in the terrain and models (car headlights, street lamps)
  • the communication protocol allows lights to be created and controlled externally
• The OptiX plug-in synchronizes the OptiX light buffer with the v-IG lights (see the sketch below)
• Lights can represent any type of emitter
• Data flow: the simulation sends object and light positions over the network; v-IG light-control and object-animation routines update externally managed lights as well as lights that are part of simulation entities (e.g. car headlights); the OptiX plug-in continuously synchronizes the OptiX light buffer with the v-IG light list
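
A sketch of pushing a host-side light list into an OptiX input buffer each frame. The Light struct and variable names are assumptions; the buffer is assumed to have been created with RT_FORMAT_USER and setElementSize(sizeof(Light)), and device programs would read the same struct layout from the "lights" buffer.

    #include <cstring>
    #include <vector>
    #include <optixu/optixpp_namespace.h>
    #include <optixu/optixu_vector_types.h>

    struct Light {
        optix::float3 position;
        optix::float3 direction;
        optix::float3 intensity;   // stored in radiometric units, not just RGB
        int type;                  // e.g. headlight, street lamp, IR emitter
    };

    void updateLightBuffer(optix::Context ctx, optix::Buffer lightBuffer,
                           const std::vector<Light>& vigLights) {
        // Resize the user-format buffer if the number of lights changed
        RTsize currentCount = 0;
        lightBuffer->getSize(currentCount);
        if (currentCount != vigLights.size())
            lightBuffer->setSize(vigLights.size());

        // Copy the host-side light list into the mapped buffer
        void* dst = lightBuffer->map();
        std::memcpy(dst, vigLights.data(), vigLights.size() * sizeof(Light));
        lightBuffer->unmap();

        // Make sure the context variable points at the buffer (idempotent)
        ctx["lights"]->set(lightBuffer);
    }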


Post processing
• Rendered buffers can be fed into the v-IG post-processing pipeline
  • programmable through the API
  • post processing with GLSL shaders
  • 32-bit floating point
• Motivations: bloom, noise, tone mapping, etc.
• Pipeline: 32-bit float OptiX output buffer -> 32-bit float OpenGL textures -> post-processing steps 1..n (e.g. tone mapping and blooming)
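
A sketch of one common way to hand the OptiX output over to the GLSL post-processing passes: back the OptiX output buffer with a pixel buffer object and copy it into a 32-bit float texture after each launch. Sizes and names are illustrative, not the actual v-IG pipeline code.

    #include <GL/glew.h>
    #include <optixu/optixpp_namespace.h>

    optix::Buffer createGlBackedOutputBuffer(optix::Context ctx,
                                             unsigned width, unsigned height,
                                             GLuint& pbo, GLuint& tex) {
        // Pixel buffer object that will hold the ray-traced RGBA32F image
        glGenBuffers(1, &pbo);
        glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
        glBufferData(GL_PIXEL_UNPACK_BUFFER, width * height * 4 * sizeof(float),
                     nullptr, GL_STREAM_DRAW);
        glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);

        // OptiX writes its output directly into the PBO
        optix::Buffer out = ctx->createBufferFromGLBO(RT_BUFFER_OUTPUT, pbo);
        out->setFormat(RT_FORMAT_FLOAT4);
        out->setSize(width, height);

        // Floating-point texture consumed by the GLSL post-processing shaders
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0,
                     GL_RGBA, GL_FLOAT, nullptr);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        return out;
    }

    // After each ctx->launch(...): copy the PBO contents into the texture,
    // then run the tone-mapping / bloom passes on that texture.
    void updatePostProcessingInput(GLuint pbo, GLuint tex,
                                   unsigned width, unsigned height) {
        glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                        GL_RGBA, GL_FLOAT, nullptr);  // source = bound PBO
        glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
    }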


Real-time data transfer
• Rendered buffers are made available to external applications
• Transfer via shared memory or network (see the sketch below)
• Produces data for hardware-in-the-loop and software-in-the-loop simulations
• Data path: 32-bit float OptiX output buffer (video RAM) -> post processing on 32-bit OpenGL textures -> RAM -> shared memory or network -> user application on other workstation(s) or sensor hardware
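
A sketch of the shared-memory hand-over to an external consumer, using plain POSIX shared memory. The segment name and header layout are assumptions, not the actual VTD interface, and a real implementation would add proper reader/writer synchronization.

    #include <cstring>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct FrameHeader {
        unsigned width;
        unsigned height;
        unsigned channels;      // e.g. 4 for RGBA32F
        unsigned frameCounter;  // lets the reader detect new frames
    };

    bool publishFrame(const float* pixels, unsigned width, unsigned height,
                      unsigned channels, unsigned frameCounter) {
        const size_t dataBytes = size_t(width) * height * channels * sizeof(float);
        const size_t totalBytes = sizeof(FrameHeader) + dataBytes;

        int fd = shm_open("/vig_sensor_image", O_CREAT | O_RDWR, 0666);
        if (fd < 0) return false;
        if (ftruncate(fd, totalBytes) != 0) { close(fd); return false; }

        void* mem = mmap(nullptr, totalBytes, PROT_READ | PROT_WRITE,
                         MAP_SHARED, fd, 0);
        close(fd);
        if (mem == MAP_FAILED) return false;

        // Write the pixel data first and the header last, so a reader polling
        // frameCounter is less likely to pick up a half-written frame.
        std::memcpy(static_cast<char*>(mem) + sizeof(FrameHeader), pixels, dataBytes);
        FrameHeader hdr = {width, height, channels, frameCounter};
        std::memcpy(mem, &hdr, sizeof(hdr));

        munmap(mem, totalBytes);
        return true;
    }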


Sensor Model Examples using OptiX + VTD


Photonic Mixing Device (PMD) Sensor Model
• The PMD sensor uses the time-of-flight principle to measure intensity and depth data of a 3D scene with modulated infrared (IR) light [Source: Keller, Kolb]
• Important systematic and stochastic distortion effects:
  • non-ambiguity range
  • extraneous-light sensitivity
  • "flying pixels", motion blur
• Three-step sensor data simulation (output: colorized depth image)


PMD Sensor Model
Approximation techniques in the current PMD sensor emulator (see the sketch below):
• Multiple rays per pixel with stratified sampling
• Simulation of the angle-dependent emission characteristics of the modulated IR light source, based on radiometry measurements
• Phong reflection model in combination with measured IR material reflection values
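
A sketch of what a PMD-style ray generation program could look like in OptiX device code (CUDA/C++, the language the plug-in exposes for new camera models). Variable, buffer and program names are assumptions; the real emulator layers the distortion effects listed above, and the modulated-IR emitter and measured reflectance models, on top of this basic per-pixel depth/intensity estimate.

    #include <optix.h>
    #include <optixu/optixu_math_namespace.h>
    #include "random.h"   // rnd() helper as shipped with the OptiX SDK samples

    using namespace optix;

    struct PerRayData {
        float distance;    // ray length to the first hit [m]
        float irRadiance;  // reflected IR radiance reaching the pixel
    };

    rtDeclareVariable(rtObject,     top_object, , );
    rtDeclareVariable(uint2,        launch_index, rtLaunchIndex, );
    rtDeclareVariable(float3,       eye, , );
    rtDeclareVariable(float3,       U, , );   // camera basis vectors
    rtDeclareVariable(float3,       V, , );
    rtDeclareVariable(float3,       W, , );
    rtDeclareVariable(unsigned int, sqrt_samples, , );  // stratification grid per pixel
    rtBuffer<float2, 2>             result;             // x = depth, y = IR intensity
    rtBuffer<unsigned int, 2>       seeds;

    RT_PROGRAM void pmd_camera() {
        const size_t2 screen = result.size();
        unsigned int seed = seeds[launch_index];
        float depth = 0.0f, intensity = 0.0f;
        const unsigned int n = sqrt_samples * sqrt_samples;

        // Stratified sampling: one jittered ray per sub-pixel cell
        for (unsigned int sy = 0; sy < sqrt_samples; ++sy)
        for (unsigned int sx = 0; sx < sqrt_samples; ++sx) {
            float2 jitter = make_float2((sx + rnd(seed)) / sqrt_samples,
                                        (sy + rnd(seed)) / sqrt_samples);
            float2 pixel = make_float2(launch_index.x + jitter.x,
                                       launch_index.y + jitter.y);
            float2 d = pixel / make_float2(screen.x, screen.y) * 2.0f - 1.0f;
            float3 dir = normalize(d.x * U + d.y * V + W);

            PerRayData prd = {0.0f, 0.0f};
            Ray ray = make_Ray(eye, dir, /*ray type*/ 0u, 1.0e-3f, RT_DEFAULT_MAX);
            rtTrace(top_object, ray, prd);

            depth     += prd.distance;    // closest-hit program fills these using the
            intensity += prd.irRadiance;  // measured IR reflectance and emitter model
        }
        seeds[launch_index] = seed;
        result[launch_index] = make_float2(depth / n, intensity / n);
    }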


Ultrasonic Sensor Model
• Currently: ideal acoustic wave propagation model
• Modeling requirements:
  • computation of primary, secondary and cross echoes
  • efficient computation of up to 20 ultrasonic sensors on a single GPU (see the sketch below)
  • consideration of the target object's material class (e.g. vegetation)
• Test scene in MATLAB; 16 circularly arranged ultrasonic sensor "depth maps" rendered simultaneously with OptiX
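
One way to render many sensors in a single OptiX launch is to tile them along one launch dimension; a host-side sketch follows. The entry point, the SensorPose struct and the per-sensor pose buffer are assumptions for illustration, not the actual emulator code.

    #include <cstring>
    #include <vector>
    #include <optixu/optixpp_namespace.h>
    #include <optixu/optixu_vector_types.h>

    struct SensorPose {
        optix::float3 position;
        optix::float3 direction;
        float openingAngle;      // acoustic cone half-angle [rad]
    };

    void launchUltrasonicSensors(optix::Context ctx,
                                 const std::vector<SensorPose>& sensors,
                                 unsigned mapWidth, unsigned mapHeight) {
        // Upload one pose per sensor; the ray generation program would pick its
        // pose via sensorIndex = launchIndex.y / mapHeight and emit rays inside
        // the acoustic cone of that sensor.
        optix::Buffer poses = ctx->createBuffer(RT_BUFFER_INPUT);
        poses->setFormat(RT_FORMAT_USER);
        poses->setElementSize(sizeof(SensorPose));
        poses->setSize(sensors.size());
        std::memcpy(poses->map(), sensors.data(),
                    sensors.size() * sizeof(SensorPose));
        poses->unmap();
        ctx["sensor_poses"]->set(poses);
        ctx["map_height"]->setUint(mapHeight);

        // One 2D launch covers all sensors: the height is tiled per sensor
        ctx->launch(/*entry point*/ 0, mapWidth, mapHeight * sensors.size());
    }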


Sensor Emulator Validation and Verification Stages
• Real sensor system (sensor test bench, real test drives, issue-recordings DB): raw data generation -> raw data processing -> objectification algorithms
• OptiX + ADTF sensor emulator (real-time capable, x-in-the-loop variants): raw data generation -> objectification algorithms
• MATLAB + ADTF sensor emulator (non-real-time, high fidelity; MATLAB ray tracer for raw data generation): raw data generation -> objectification algorithms
• 1. Validation: prototypical testing of the OptiX emulator against the MATLAB emulator; comparison based on a static and dynamic description of "analytical scenes" using XML
• 2. Verification: testing against real sensor data; comparison based on reference data and virtual scene generation


Sensor Emulator Validation and Verification Toolchain
• MATLAB/Simulink acts as master control unit and closed-loop model; it exchanges state data with the simulation and performs offline data comparison
• v-IG with the sensor plug-in (OptiX API), fed by vehicle dynamics, objects, materials, traffic, ..., produces raw sensor data
• ADTF runs the sensor objectification algorithms and ADAS algorithms on both synthetic and real data and returns actuator data
• Real sensor data recordings, incl. reference and scenario data: per-frame real and synthetic data with ground truth for scenarios #1 to #N
• Data exchange via shared memory or network


Summary
► We showed our approach for supporting ADAS algorithm and function testing by using Virtual Test Drive and OptiX for multi-sensor data simulation
► Related requirements and implemented concepts for realistic multi-sensor simulation:
  ► physics-oriented sensor modeling using OptiX
  ► a common sensor-model simulation infrastructure
  ► advanced material and emitter specifications
  ► validation and verification process
► Ray tracing with OptiX seems to be a reasonable platform. However, we are just at the beginning ...
(Image: AUDI ADAS demonstrator)


Outlook
Challenges to be tackled in the future regarding ...
► A standard for multi-spectral material and emitter specifications
  ► simulation-software-independent description and identification scheme
  ► physical property handling of materials and emitters, e.g. for wavelengths of 300–1000 nm
  ► support for different measurement data formats and standards
► OptiX
  ► support for large scenarios (1000s of objects, 100s of materials, 10s of sensor models, ...)
  ► improved multi-GPU scalability for 60 Hz and higher
  ► improved OptiX debugging, profiling and optimization tools


Thank you very much. Vielen Dank.

Get in touch with us if you are also using OptiX for sensor simulation!

Contact:
Erwin Roth, Technische Universität München, erwin.roth@in.tum.de
Tugkan Calapoglu, Vires Simulationstechnologie GmbH, tugkan@vires.com


BACKUP


Why was NVIDIA OptiX selected?
• Since Intel's Larrabee was never released ;-)
• Programmability and flexibility of OptiX's ray tracing pipeline
  • customizable ray tracing pipeline
  • focus on the mathematical model rather than on 3D programming
• Many-core, multi-GPU scalability
• Availability for different platforms [Source: NVIDIA]
• AUDI was already using the Virtual Test Drive (VTD) simulation system
• We decided to extend the OpenSceneGraph-based 3D renderer of VTD with an OptiX plug-in
• This allows us to reuse most of the existing rendering and simulation infrastructure


XML Scene Data Interchange Format
• File format for OptiX node graphs
  • XML-based format
  • easily readable and editable
  • C++ library for loading/saving
  • can be integrated into other software
• Motivations: sensor model validation, debugging


Integrated Material Handling
• CAD system: design of geometry, initial material assignment; outputs geometry data plus a Global Unique Material ID (GUMID) via a material translator
• PDM system: central, organization-wide data management
• Downstream consumers, each connected through their own material translator:
  • manufacturing
  • high-end visualization (RTT DeltaGen, Maya, ...) with an advanced material + emitter DB keyed by GUMID
  • simulation (VTD, FEM crash, thermal, ...) with an advanced material reference DB keyed by GUMID
• Material measurement labs feed the material databases
• GUMID concept based on Gerd Sussner, RTT AG


Multi-Sensor Simulation Material Pre-processing Pipeline
• Material assignment (MA) tables exist at several levels: organization, simulation environment, test scenario and sensor model
• Inputs: geometry data incl. geometry IDs, road surface model, high-polygon vehicle models and optimized polygon vehicle meta models, test scenario
• Material sources: organization-wide advanced material + emitter DB, simulation environment vendor material + emitter DB, advanced material + emitter reference DB
• An offline pre-processing stage produces material + emitter data and vehicle models optimized for the simulation engine, which are then used in the simulator environment


Advanced Emitter Description
• Simulating interference effects on sensors requires models of ego- and extraneous emitters
• Examples of emitters:
  • vehicle headlights, traffic lights, street lamps
  • emitters of active sensors (radar, infrared light source, ...)
  • Car2X transmitters
• Related emission characteristics should be stored in physically measurable SI units using radiometry, in order not to cover the visible light spectrum only
• The specific emitter characteristics should be stored in an organization-wide database with a unique emitter ID
Example emitter record (see the sketch below):
• Unique emitter ID, parent emitter ID
• Metadata: brief description, class 'Vehicle Headlight', emission spectrum (min, max), state flags, number of sub-emitters = 2
• Sub-emitter 1: position and orientation relative to a reference coordinate system; number of state profiles = 3; each state profile holds an emission map format and the emission map data
• Sub-emitter 2: ...
• OpenGL fallback illumination model: frustum data, Phong intensity data, attenuation
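
A minimal sketch of an emitter record mirroring the fields on this slide; the names and types are illustrative assumptions, not the actual database schema.

    #include <cstdint>
    #include <string>
    #include <vector>

    struct EmitterStateProfile {
        std::string emissionMapFormat;      // e.g. angular intensity map format
        std::vector<float> emissionMapData; // radiometric values in SI units
    };

    struct SubEmitter {
        float position[3];                  // relative to the reference coordinate system
        float orientation[3];
        std::vector<EmitterStateProfile> stateProfiles;  // e.g. low beam, high beam, off
    };

    struct EmitterRecord {
        uint32_t id = 0;                    // organization-wide unique emitter ID
        uint32_t parentId = 0;
        std::string description;            // e.g. class 'Vehicle Headlight'
        float emissionSpectrumMinNm = 0.0f;
        float emissionSpectrumMaxNm = 0.0f;
        uint32_t stateFlags = 0;
        std::vector<SubEmitter> subEmitters;
        // OpenGL fallback illumination model
        float frustum[6];                    // simple frustum description
        float phongIntensity[3];
        float attenuation[3];                // constant / linear / quadratic
    };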
