MotionCaptureJointCalibration.jl

Kinematic calibration for robots using motion capture data


MotionCaptureJointCalibration provides functionality for kinematic calibration of robots, given measurements of the positions of motion capture markers attached to the robot's links, along with the robot's joint positions, in a number of poses. It does so by solving a nonlinear program (NLP) that minimizes the (weighted) squared error between measured and predicted marker positions.

MotionCaptureJointCalibration is a small Julia library built on top of JuMP and RigidBodyDynamics.jl. JuMP makes it possible to choose between various NLP solvers. Ipopt appears to perform fairly well for the problems formulated by this package.
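The shape of such an NLP can be sketched with JuMP and Ipopt directly. The following is a toy version only, assuming a planar one-link arm with a single marker at its tip and synthetic data; all names, geometry, and data are illustrative and are not this package's API:

```julia
using JuMP, Ipopt

# Toy calibration NLP: estimate a constant joint offset that minimizes the
# squared error between measured and predicted marker positions.
q_meas = [0.1, 0.5, 1.0, 1.4]      # measured joint positions (rad)
true_offset = 0.03                 # ground truth, used only to synthesize data
len = 1.0                          # link length; the marker sits at the tip
mx = [len * cos(q + true_offset) for q in q_meas]   # measured marker x
my = [len * sin(q + true_offset) for q in q_meas]   # measured marker y

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, -0.1 <= offset <= 0.1)   # rough bounds on the unknown offset
@NLobjective(model, Min,
    sum((mx[i] - len * cos(q_meas[i] + offset))^2 +
        (my[i] - len * sin(q_meas[i] + offset))^2 for i in 1:4))
optimize!(model)
value(offset)   # recovers an offset near 0.03
```

Because JuMP abstracts over the solver, swapping Ipopt for another JuMP-compatible NLP solver only changes the `Model` constructor line.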

News

  • October 18, 2017: tagged version 0.0.1.
  • August 4, 2017: the package is under initial construction.

Features

Features include:

  • handling of occlusions
  • handling of measurements of the body-fixed locations of only a subset of the markers attached to the robot (the unknown marker positions will be solved for, given rough bounds)
  • handling of measurements of only a subset of a robot's joint positions (the unknown joint positions will be solved for, given rough bounds)
  • proper handling of quaternion-parameterized floating joints (unit norm constraints for quaternions)
  • visualization of calibration results using RigidBodyTreeInspector
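The quaternion handling in the list above amounts to adding a unit-norm constraint on the four quaternion components of a floating joint. A minimal illustration of that constraint in JuMP (names and data are hypothetical, not this package's API) is finding the unit quaternion closest to a noisy, non-unit measurement:

```julia
using JuMP, Ipopt

# Noisy, non-unit quaternion "measurement" (illustrative data)
q_meas = [0.95, 0.10, 0.20, 0.15]

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, -1 <= q[i = 1:4] <= 1, start = q_meas[i])
@objective(model, Min, sum((q[i] - q_meas[i])^2 for i in 1:4))
@constraint(model, sum(q[i]^2 for i in 1:4) == 1)   # unit-norm constraint
optimize!(model)
# value.(q) is q_meas rescaled to unit norm, the nearest valid rotation
```

Without such a constraint, the optimizer could drive the quaternion away from the unit sphere, producing a parameterization that no longer represents a rotation.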

Currently, MotionCaptureJointCalibration can only estimate constant offsets between measured and actual joint positions.

Installation

To install, simply run

Pkg.add("MotionCaptureJointCalibration")

This will install MotionCaptureJointCalibration and its required dependencies. RigidBodyTreeInspector.jl is an optional dependency and can be used to visualize the calibration results (Pkg.add("RigidBodyTreeInspector")). You'll also need an NLP solver that interfaces with JuMP, e.g. Ipopt (Pkg.add("Ipopt")).

Usage

See the demo notebook for usage.

Acknowledgements

A variant of the NLP formulation used in this package is due to Michael Posa.