AirGen Data Types
- class airgen.types.AngleLevelControllerGains(roll_gains=<airgen.types.PIDGains object>, pitch_gains=<airgen.types.PIDGains object>, yaw_gains=<airgen.types.PIDGains object>)
Bases:
object
Struct to contain controller gains used by the angle level PID controller
- roll_gains
kP, kI, kD for roll axis
- Type:
PIDGains
- pitch_gains
kP, kI, kD for pitch axis
- Type:
PIDGains
- yaw_gains
kP, kI, kD for yaw axis
- Type:
PIDGains
- to_lists()
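usage (a minimal sketch; setAngleLevelControllerGains is an assumption mirroring the AirSim-style client API):
>>> from airgen.types import AngleLevelControllerGains, PIDGains
>>> gains = AngleLevelControllerGains(roll_gains=PIDGains(2.5, 0.0, 0.0),
...                                   pitch_gains=PIDGains(2.5, 0.0, 0.0),
...                                   yaw_gains=PIDGains(2.5, 0.0, 0.0))
>>> airgen_client.setAngleLevelControllerGains(gains)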
- class airgen.types.AngleRateControllerGains(roll_gains=<airgen.types.PIDGains object>, pitch_gains=<airgen.types.PIDGains object>, yaw_gains=<airgen.types.PIDGains object>)
Bases:
object
Struct to contain controller gains used by the angle rate PID controller
- roll_gains
kP, kI, kD for roll axis
- Type:
PIDGains
- pitch_gains
kP, kI, kD for pitch axis
- Type:
PIDGains
- yaw_gains
kP, kI, kD for yaw axis
- Type:
PIDGains
- to_lists()
- class airgen.types.BarometerData
Bases:
MsgpackMixin
Barometer sensor for multirotors and cars in AirGen.
default settings in settings.json:
{ "DefaultSensors": { "BarometerSensorName": { "SensorType": 1, "Enabled" : true, "PressureFactorSigma": 0.001825, "PressureFactorTau": 3600, "UncorrelatedNoiseSigma": 2.7, "UpdateLatency": 0, "UpdateFrequency": 50, "StartupDelay": 0 } }
usage:
>>> barometer_data=airgen_client.getBarometerData("BarometerSensorName")
- altitude: Quaternionr = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
altitude of the barometer
- pressure: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
pressure of the barometer
- qnh: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
qnh of the barometer
- time_stamp: uint64 = 0
timestamp of the barometer data
- class airgen.types.Box2D
Bases:
MsgpackMixin
- max = <Vector2r> { 'x_val': 0.0, 'y_val': 0.0}
- min = <Vector2r> { 'x_val': 0.0, 'y_val': 0.0}
- class airgen.types.Box3D
Bases:
MsgpackMixin
- max = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- min = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- class airgen.types.CameraInfo
Bases:
MsgpackMixin
- fov = -1
- pose = <Pose> { 'orientation': <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}, 'position': <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}}
- proj_mat = <ProjectionMatrix> { }
- class airgen.types.CarControls(throttle=0, steering=0, brake=0, handbrake=False, is_manual_gear=False, manual_gear=0, gear_immediate=True)
Bases:
MsgpackMixin
- brake = 0.0
- gear_immediate = True
- handbrake = False
- is_manual_gear = False
- manual_gear = 0
- set_throttle(throttle_val, forward)
- steering = 0.0
- throttle = 0.0
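usage (a minimal sketch; setCarControls is an assumption mirroring the AirSim-style client API):
>>> car_controls = airgen.types.CarControls()
>>> car_controls.set_throttle(0.5, forward=True)  # drive forward at half throttle
>>> car_controls.steering = 0.2
>>> airgen_client.setCarControls(car_controls)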
- class airgen.types.CarState
Bases:
MsgpackMixin
- collision = <CollisionInfo> { }
- gear = 0
- handbrake = False
- kinematics_estimated = <KinematicsState> { }
- maxrpm = 0.0
- rpm = 0.0
- speed = 0.0
- timestamp = 0
- class airgen.types.CollisionInfo
Bases:
MsgpackMixin
- has_collided = False
- impact_point = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- normal = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- object_id = -1
- object_name = ''
- penetration_depth = 0.0
- position = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- time_stamp = 0.0
- class airgen.types.DetectionInfo
Bases:
MsgpackMixin
- box2D = <Box2D> { }
- box3D = <Box3D> { }
- geo_point = <GeoPoint> { 'altitude': 0.0, 'latitude': 0.0, 'longitude': 0.0}
- name = ''
- relative_pose = <Pose> { 'orientation': <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}, 'position': <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}}
- class airgen.types.DistanceSensorData
Bases:
MsgpackMixin
Distance sensor
default settings in settings.json:
{ "DefaultSensors": { "DistanceSensorName": { "SensorType": 5, "Enabled" : true, "MinDistance": 0.2, # Minimum distance measured by distance sensor (metres, only used to fill Mavlink message for PX4) (Default 0.2m) "MaxDistance": 40, Maximum distance measured by distance sensor (metres) (Default 40.0m) "X": 0, "Y": 0, "Z": -1, # Position of the sensor relative to the vehicle (in NED, in meters) (Default (0,0,0)-Multirotor, (0,0,-1)-Car) "Yaw": 0, "Pitch": 0, "Roll": 0, # Orientation of the sensor relative to the vehicle (degrees) (Default (0,0,0)) "ExternalController": false, # Whether the sensor is controlled by an external controller (Default false)Whether data is to be sent to external controller such as ArduPilot or PX4 if being used (default true) "DrawDebugPoints": false } } }
Note
For cars, the sensor is placed 1 meter above the vehicle center by default. This is required because otherwise the sensor returns spurious readings due to being inside the vehicle. It does not affect the measured values, e.g., when measuring the distance between two cars.
usage:
>>> distance_sensor_data=airgen_client.getDistanceSensorData("DistanceSensorName")
- distance: float = 0.0
distance measured by the distance sensor
- max_distance: float = 0.0
maximum distance measured by the distance sensor
- min_distance: float = 0.0
minimum distance measured by the distance sensor
- relative_pose: Pose = <Pose> { 'orientation': <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}, 'position': <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}}
relative pose of the distance sensor
- time_stamp = 0
timestamp of the distance sensor data
- class airgen.types.DrivetrainType
Bases:
object
- ForwardOnly = 1
- MaxDegreeOfFreedom = 0
- Orient = 2
- class airgen.types.EnvironmentState
Bases:
MsgpackMixin
- air_density = 0.0
- air_pressure = 0.0
- geo_point = <GeoPoint> { 'altitude': 0.0, 'latitude': 0.0, 'longitude': 0.0}
- gravity = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- position = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- temperature = 0.0
- class airgen.types.GeoPoint(latitude=0.0, longitude=0.0, altitude=0.0)
Bases:
MsgpackMixin
- altitude = 0.0
- latitude = 0.0
- longitude = 0.0
- class airgen.types.GeoPose(geopoint_val=None, orientation_val=None)
Bases:
MsgpackMixin
- Parameters:
geopoint_val (GeoPoint)
orientation_val (Quaternionr)
- geopoint = <GeoPoint> { 'altitude': 0.0, 'latitude': 0.0, 'longitude': 0.0}
- orientation = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- class airgen.types.GnssFixType
Bases:
MsgpackMixin
Represents the GNSS (Global Navigation Satellite System) fix types in AirGen.
This class categorizes the various types of GNSS fixes based on the quality and source of satellite data.
- GNSS_FIX_2D_FIX = 2
- GNSS_FIX_3D_FIX = 3
- GNSS_FIX_NO_FIX = 0
- GNSS_FIX_TIME_ONLY = 1
- class airgen.types.GnssReport
Bases:
MsgpackMixin
Represents a GNSS (Global Navigation Satellite System) report in AirGen.
This class provides detailed information about the current GNSS status and readings.
- eph = 0.0
- epv = 0.0
- fix_type = <GnssFixType> { }
- geo_point = <GeoPoint> { 'altitude': 0.0, 'latitude': 0.0, 'longitude': 0.0}
- time_utc = 0
- velocity = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- class airgen.types.GpsData
Bases:
MsgpackMixin
GPS sensor
default settings in settings.json:
{ "DefaultSensors": { "GPSSensorName": { "SensorType": 3, "Enabled" : true, "EphTimeConstant": 0.9, "EpvTimeConstant": 0.9, "EphInitial": 25, "EpvInitial": 25, "EphFinal": 0.1, "EpvFinal": 0.1, "EphMin3d": 3, "EphMin2d": 4, "UpdateLatency": 0.2, "UpdateFrequency": 50, "StartupDelay": 1 } } }
usage:
>>> gps_data=airgen_client.getGpsData("GPSSensorName")
- gnss = <GnssReport> { }
gnss report of the gps
- is_valid = False
whether the gps data is valid
- time_stamp = 0
timestamp of the gps data
- class airgen.types.ImageRequest(camera_name, image_type, pixels_as_float=False, compress=True)
Bases:
MsgpackMixin
A class to facilitate image capture requests in AirGen.
- Parameters:
camera_name (str)
image_type (ImageType)
pixels_as_float (bool)
compress (bool)
- camera_name: str = 'front_center'
description: name of the camera.
For cars, the default camera names are: front_center, front_right, front_left, fpv, back_center. The fpv camera is at the driver's head position in the car.
For multirotors, the default camera names are: front_center, front_right, front_left, bottom_center.
In computer vision mode, the camera names are the same as in multirotor mode.
Apart from these, you can add more cameras to the vehicles, as well as external cameras not attached to any vehicle, through the settings.json file.
- compress: bool = False
if true, the image is compressed using JPEG; otherwise an uncompressed PNG is returned. Cannot be true when pixels_as_float is true. Check ImageType for the proper parameters for each image type.
- image_type: ImageType = 0
- pixels_as_float: bool = False
if true, image pixels are returned as float, else as uint8. Check ImageType for the proper parameters of each image type
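usage (a minimal sketch; simGetImages is an assumption mirroring the AirSim-style client API and accepts a list of ImageRequest objects):
>>> requests = [airgen.types.ImageRequest("front_center", airgen.types.ImageType.Scene, pixels_as_float=False, compress=False),
...             airgen.types.ImageRequest("front_center", airgen.types.ImageType.DepthPlanar, pixels_as_float=True, compress=False)]
>>> responses = airgen_client.simGetImages(requests)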
- class airgen.types.ImageResponse
Bases:
MsgpackMixin
A class to handle and represent image responses from AirGen.
- camera_fov: float = 0.0
- camera_orientation: Quaternionr = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- camera_position: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- compress: bool = True
- height: int = 0
- image_data_float: List[float] = []
description: float image data, returned as a list of floats. Empty if pixels_as_float is false.
usage:
>>> img1d = np.asarray(response.image_data_float, dtype=np.float32)
>>> img = img1d.reshape(response.height, response.width, num_channels)
- image_data_uint8: bytes = b''
description: uint8 image data stored in raw bytes. Empty if pixels_as_float is true.
usage:
>>> img1d = np.frombuffer(response.image_data_uint8, dtype=np.uint8)  # get 1d numpy array
>>> img_rgb = img1d.reshape(response.height, response.width, 3)  # reshape to 3-channel image array H x W x 3
- image_type: ImageType = 0
- message: str = ''
- pixels_as_float: bool = False
- time_stamp = 0
- width: int = 0
- class airgen.types.ImageType
Bases:
object
Image types supported by AirGen.
Default settings of camera type in settings.json
{ "CameraDefaults": { "CaptureSettings": [ "ImageType": 0, "Width": 256, "Height": 256, "FOV_Degrees": 90, "AutoExposureSpeed": 100.0, "AutoExposureBias": 0, "MotionBlurAmount": 0.0, "MotionBlurMaxDistortion": 0.0, "TargetGamma": 1.0 } ] } }
- DepthPerspective = 2
description: distance between the object and the camera's focal point. This is suitable for point cloud reconstruction.
- pertinent parameters in ImageRequest:
pixels_as_float: True
compress: False
image data stored as numpy array with dtype:np.float32 and shape:[height, width, 1]
- DepthPlanar = 1
description: distance between object and camera's imaging plane, i.e., all points that are plane-parallel to the camera have same depth
- pertinent parameters in ImageRequest:
pixels_as_float: True
compress: False
image data stored as numpy array with dtype:np.float32 and shape:[height, width, 1]
- DepthVis = 3
description: encodes depth info into RGB channels for visualization. Each pixel value is interpolated from black to white depending on depth in the camera plane, in meters. Pure white pixels indicate a depth of 100 m or more, while pure black indicates a depth of 0 meters.
- pertinent parameters in ImageRequest:
pixels_as_float: False
image stored as numpy array with dtype:uint8 and shape: [height, width, 3]
- DisparityNormalized = 4
description: normalized disparity, each pixel is (Xl - Xr)/Xmax, which is thereby normalized to values between 0 to 1.
- pertinent parameters in ImageRequest:
pixels_as_float: True
compress: False
image data stored as numpy array with dtype=np.float32 and shape:[height, width, 1]
- Infrared = 7
description: Currently this is just a map from object's SegmentationID to grey scale 0-255. So any mesh with object ID 42 shows up with color (42, 42, 42).
- pertinent parameters in ImageRequest:
pixels_as_float: False
image data stored as numpy array with dtype:uint8 and shape:[height, width, 3]
- OpticalFlow = 8
description: optical flow, i.e., information about scene motion perceived from the camera's point of view. OpticalFlow returns a 2-channel image where the channels correspond to vx and vy respectively.
- pertinent parameters in ImageRequest:
pixels_as_float: True
compress: False
image data stored as numpy array with dtype:np.float32 and shape:[height, width, 2]
- OpticalFlowVis = 9
description: encodes optical flow into RGB channels for visualization
- pertinent parameters in ImageRequest:
pixels_as_float: False
image data stored as numpy array with dtype:uint8 and shape:[height, width, 3]
- Scene = 0
description: RGB image
- pertinent parameters in ImageRequest:
pixels_as_float: False
image data stored as numpy array with dtype:uint8 array with shape:[height, width, 3]
- Segmentation = 5
description: RGB image where each pixel encodes a unique segmentation ID of the object. It is recommended that you request an uncompressed image when using this API.
- pertinent parameters in ImageRequest:
pixels_as_float: False
compress: False
image data stored as numpy array with dtype:uint8 and shape:[height, width, 3]
- SurfaceNormals = 6
description: surface normals of objects in the scene encoded into RGB channels. To get the unit normal vector, use the following formula:
x = (r / 255) * 2.0 - 1.0
y = (g / 255) * 2.0 - 1.0
z = (b / 255) * 2.0 - 1.0
- pertinent parameters in ImageRequest:
pixels_as_float: False
image data stored as numpy array with dtype:uint8 and shape:[height, width, 3]
- static to_str(image_type)
convert image type to string
- Parameters:
image_type (int) -- image type value to convert
- Returns:
name of the image type
- Return type:
str
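usage (a minimal sketch of requesting and decoding a DepthPerspective image; simGetImages is an assumption mirroring the AirSim-style client API):
>>> request = airgen.types.ImageRequest("front_center", airgen.types.ImageType.DepthPerspective, pixels_as_float=True, compress=False)
>>> response = airgen_client.simGetImages([request])[0]
>>> depth = np.asarray(response.image_data_float, dtype=np.float32).reshape(response.height, response.width, 1)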
- class airgen.types.ImuData
Bases:
MsgpackMixin
IMU sensor for multirotors and cars in AirGen.
default settings in settings.json:
{ "DefaultSensors": { "IMUSensorName": { "SensorType": 2, "Enabled" : true, "AngularRandomWalk": 0.3, # TODO: ADD explanation "GyroBiasStabilityTau": 500, "GyroBiasStability": 4.6, "VelocityRandomWalk": 0.24, "AccelBiasStabilityTau": 800, "AccelBiasStability": 36 } } }
usage:
>>> imu_data=airgen_client.getImuData("IMUSensorName")
- angular_velocity: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
angular velocity of the imu
- linear_acceleration: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
linear acceleration of the imu
- orientation: Quaternionr = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
orientation of the imu
- time_stamp = 0
timestamp of the imu data
- class airgen.types.KinematicsState
Bases:
MsgpackMixin
- angular_acceleration = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- angular_velocity = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- linear_acceleration = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- linear_velocity = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- orientation = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- position = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- class airgen.types.LandedState
Bases:
object
- Flying = 1
- Landed = 0
- class airgen.types.LidarData
Bases:
MsgpackMixin
Lidar sensor for multirotors and cars in AirGen. Multiple lidars with different settings can be installed on a vehicle under different sensor names.
default settings in settings.json:
{ "DefaultSensors": { "LidarSensorName": { "SensorType": 6, "Enabled" : true, "NumberOfChannels": 16, # Number of channels/lasers of the lidar "Range": 10000, # Max distance of the lidar in meters "RotationsPerSecond": 10, # Number of times the lidar rotates per second "PointsPerSecond": 100000, # Number of points captured per second "X": 0, "Y": 0, "Z": -1, # Position of the lidar relative to the vehicle (in NED, in meters) "Roll": 0, "Pitch": 0, "Yaw" : 0, # Orientation of the lidar relative to the vehicle (in degrees, yaw-pitch-roll order to front vector +X) "VerticalFOVUpper": -15, # Vertical FOV upper limit for the lidar, in degrees (multirotor -15, car +10) "VerticalFOVLower": -45, # Vertical FOV lower limit for the lidar, in degrees (multirotor -45, car -10) "HorizontalFOVStart": 0, # Horizontal FOV start for the lidar, in degrees "HorizontalFOVEnd": 359, #Horizontal FOV end for the lidar, in degrees "DrawDebugPoints": true, "DataFrame": "SensorLocalFrame" # Frame for the points in output ("VehicleInertialFrame" or "SensorLocalFrame") } } }
usage:
>>> lidar_data=airgen_client.getLidarData("LidarSensorName")
- point_cloud: List[float] = []
point cloud data represented by a list of 3d coordinates ([x0,y0,z0,x1,y1,z1...]), len(point_cloud) = 3 * NumberOfPoints
- pose: Pose = <Pose> { 'orientation': <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}, 'position': <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}}
pose of the lidar sensor
- segmentation: List[int] = []
Segmentation Id of each point in the point cloud, len(segmentation) = NumberOfPoints
- time_stamp = 0
timestamp of the lidar data
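usage (a minimal sketch of reshaping the flat point cloud into an N x 3 array):
>>> lidar_data = airgen_client.getLidarData("LidarSensorName")
>>> points = np.asarray(lidar_data.point_cloud, dtype=np.float32)
>>> points = points.reshape(len(points) // 3, 3)  # one (x, y, z) row per point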
- class airgen.types.MagnetometerData
Bases:
MsgpackMixin
Magnetometer sensor for multirotors and cars in AirGen.
default settings in settings.json:
{ "DefaultSensors": { "MagnetometerSensorName": { "SensorType": 3, # TODO: ADD explanation"SensorType": 4, "Enabled" : true, "NoiseSigma": 0.005, "ScaleFactor": 1, "NoiseBias": 0, "UpdateLatency": 0, "UpdateFrequency": 50, "StartupDelay": 0 } } }
usage:
>>> magnetometer_data=airgen_client.getMagnetometerData("MagnetometerSensorName")
- magnetic_field_body: Vector3r = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
magnetic field measured by the magnetometer, in the vehicle body frame
- magnetic_field_covariance: float = 0.0
magnetic field covariance of the magnetometer
- time_stamp = 0
timestamp of the magnetometer data
- class airgen.types.MeshPositionVertexBuffersResponse
Bases:
MsgpackMixin
- indices = 0.0
- name = ''
- orientation = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- position = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- vertices = 0.0
- class airgen.types.MsgpackMixin
Bases:
object
- classmethod from_msgpack(encoded)
- to_msgpack(*args, **kwargs)
- class airgen.types.MultirotorState
Bases:
MsgpackMixin
- can_arm = False
- collision = <CollisionInfo> { }
- gps_location = <GeoPoint> { 'altitude': 0.0, 'latitude': 0.0, 'longitude': 0.0}
- kinematics_estimated = <KinematicsState> { }
- landed_state = 0
- rc_data = <RCData> { 'is_initialized': False, 'is_valid': False, 'pitch': 0.0, 'roll': 0.0, 'switch1': 0, 'switch2': 0, 'switch3': 0, 'switch4': 0, 'switch5': 0, 'switch6': 0, 'switch7': 0, 'switch8': 0, 'throttle': 0.0, 'timestamp': 0, 'yaw': 0.0}
- ready = False
- ready_message = ''
- timestamp = 0
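usage (a minimal sketch; getMultirotorState is an assumption mirroring the AirSim-style client API):
>>> state = airgen_client.getMultirotorState()
>>> position = state.kinematics_estimated.position  # Vector3r position estimate
>>> airborne = state.landed_state == airgen.types.LandedState.Flying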
- class airgen.types.PIDGains(kp, ki, kd)
Bases:
object
Struct to store values of PID gains. Used to transmit controller gain values while instantiating AngleLevel/AngleRate/Velocity/PositionControllerGains objects.
- kP
Proportional gain
- Type:
float
- kI
Integrator gain
- Type:
float
- kD
Derivative gain
- Type:
float
- to_list()
- class airgen.types.Pose(position_val=None, orientation_val=None)
Bases:
MsgpackMixin
- containsNan()
- static nanPose()
- orientation = <Quaternionr> { 'w_val': 1.0, 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
- position = <Vector3r> { 'x_val': 0.0, 'y_val': 0.0, 'z_val': 0.0}
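usage (a minimal sketch of constructing a Pose from a position and an orientation):
>>> position = airgen.types.Vector3r(10.0, 0.0, -5.0)
>>> orientation = airgen.types.Quaternionr.from_euler_angles(pitch=0.0, roll=0.0, yaw=1.57)
>>> pose = airgen.types.Pose(position, orientation)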
- class airgen.types.PositionControllerGains(x_gains=<airgen.types.PIDGains object>, y_gains=<airgen.types.PIDGains object>, z_gains=<airgen.types.PIDGains object>)
Bases:
object
Struct to contain controller gains used by position PID controller
- x_gains
kP, kI, kD for X axis
- Type:
PIDGains
- y_gains
kP, kI, kD for Y axis
- Type:
PIDGains
- z_gains
kP, kI, kD for Z axis
- Type:
PIDGains
- to_lists()
- class airgen.types.ProjectionMatrix
Bases:
MsgpackMixin
- matrix = []
- class airgen.types.Quaternionr(w_val=1.0, x_val=0.0, y_val=0.0, z_val=0.0)
Bases:
MsgpackMixin
AirGen's representation of a quaternion in wxyz format.
- conjugate()
- containsNan()
- cross(other)
- dot(other)
- classmethod from_axis_angle(axis, angle)
create a quaternion from an axis and an angle
- Parameters:
axis (Vector3r) -- axis of rotation
angle (float) -- angle of rotation
- Returns:
quaternion representing the rotation
- Return type:
Quaternionr
- classmethod from_euler_angles(pitch, roll, yaw)
transform from euler angles to quaternion
- Parameters:
pitch (float) -- pitch in radians. Positive pitch means tilt upward
roll (float) -- roll in radians
yaw (float) -- yaw in radians
- Returns:
quaternion representation of the euler angles
- Return type:
Quaternionr
- get_length()
- inverse()
- static nanQuaternionr()
- outer_product(other)
- rotate(other)
- sgn()
- slerp(other, fraction)
- star()
- to_numpy_array()
- w_val = 0.0
- property wxyz: Tuple[float, float, float, float]
return the quaternion as a tuple (w, x, y, z)
- Returns:
(w, x, y, z)
- Return type:
Tuple[float, float, float, float]
- x_val = 0.0
- property xyzw: Tuple[float, float, float, float]
return the quaternion as a tuple (x, y, z, w)
- Returns:
(x, y, z, w)
- Return type:
Tuple[float, float, float, float]
- y_val = 0.0
- z_val = 0.0
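usage (a minimal sketch of converting Euler angles to a quaternion and reading the components back in either order):
>>> q = airgen.types.Quaternionr.from_euler_angles(pitch=0.0, roll=0.0, yaw=0.5)
>>> q.wxyz  # (w, x, y, z)
>>> q.xyzw  # (x, y, z, w), e.g. for libraries expecting xyzw order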
- class airgen.types.RCData(timestamp=0, pitch=0.0, roll=0.0, throttle=0.0, yaw=0.0, switch1=0, switch2=0, switch3=0, switch4=0, switch5=0, switch6=0, switch7=0, switch8=0, is_initialized=False, is_valid=False)
Bases:
MsgpackMixin
- is_initialized = False
- is_valid = False
- pitch = 0.0
- roll = 0.0
- switch1 = 0
- switch2 = 0
- switch3 = 0
- switch4 = 0
- switch5 = 0
- switch6 = 0
- switch7 = 0
- switch8 = 0
- throttle = 0.0
- timestamp = 0
- yaw = 0.0
- class airgen.types.RotorStates
Bases:
MsgpackMixin
- rotors = []
- timestamp = 0
- class airgen.types.Vector2r(x_val=0.0, y_val=0.0)
Bases:
MsgpackMixin
AirGen Vector2r class representing a 2D vector, mainly used for parameters in simAPI calls.
- x_val = 0.0
- y_val = 0.0
- class airgen.types.Vector3r(x_val=0.0, y_val=0.0, z_val=0.0)
Bases:
MsgpackMixin
AirGen Vector3r class representing a 3D vector, mainly used for parameters in simAPI calls.
- containsNan()
- cross(other)
- distance_to(other)
- dot(other)
- get_length()
- static nanVector3r()
- to_numpy_array()
- x_val = 0.0
- y_val = 0.0
- z_val = 0.0
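usage (a minimal sketch of common vector operations):
>>> a = airgen.types.Vector3r(1.0, 0.0, 0.0)
>>> b = airgen.types.Vector3r(0.0, 1.0, 0.0)
>>> a.dot(b)
>>> a.cross(b)
>>> a.distance_to(b)
>>> a.to_numpy_array()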
- class airgen.types.VelocityControllerGains(x_gains=<airgen.types.PIDGains object>, y_gains=<airgen.types.PIDGains object>, z_gains=<airgen.types.PIDGains object>)
Bases:
object
Struct to contain controller gains used by velocity PID controller
- x_gains
kP, kI, kD for X axis
- Type:
PIDGains
- y_gains
kP, kI, kD for Y axis
- Type:
PIDGains
- z_gains
kP, kI, kD for Z axis
- Type:
PIDGains
- to_lists()
- class airgen.types.WeatherParameter
Bases:
object
- Dust = 6
- Enabled = 8
- Fog = 7
- MapleLeaf = 4
- Rain = 0
- RoadLeaf = 5
- RoadSnow = 3
- Roadwetness = 1
- Snow = 2
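usage (a minimal sketch; simEnableWeather and simSetWeatherParameter are assumptions mirroring the AirSim-style client API):
>>> airgen_client.simEnableWeather(True)
>>> airgen_client.simSetWeatherParameter(airgen.types.WeatherParameter.Rain, 0.75)  # intensity in [0, 1]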
- class airgen.types.YawMode(is_rate=True, yaw_or_rate=0.0)
Bases:
MsgpackMixin
- is_rate = True
- yaw_or_rate = 0.0
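usage (a minimal sketch combining DrivetrainType and YawMode in a motion command; moveToPositionAsync is an assumption mirroring the AirSim-style client API):
>>> yaw_mode = airgen.types.YawMode(is_rate=False, yaw_or_rate=90.0)  # hold a fixed 90 degree yaw
>>> airgen_client.moveToPositionAsync(10, 0, -5, velocity=3,
...                                   drivetrain=airgen.types.DrivetrainType.MaxDegreeOfFreedom,
...                                   yaw_mode=yaw_mode).join()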