To identify a suitable compromise between resolution, accuracy, speed, and ease of use, the 3D linear Hall-effect sensor ALS31300 (Allegro MicroSystems, Manchester, USA) was used. The sensor is available in three different measuring ranges (\(\pm 50\,\text{mT}\), \(\pm 100\,\text{mT}\), \(\pm 200\,\text{mT}\)) with the same footprint and software. This allows the appropriate chip to be selected for the desired measuring application without any other hardware changes. The variant ALS31300x-500, with a measuring range of \(\pm 50\,\text{mT}\), was used in the following configuration.
The system is divided into two subsystems: a sensor head (see Fig. 1a), on which eight Hall sensors are soldered, and a central control unit with a microcontroller (XIAO ESP32C3, Seeed Technology, Shenzhen, China), which controls the Hall sensors, receives the measured values, processes them, and sends them to a computer through USB (schematic in Fig. 1b). The microcontroller software was developed using the Arduino framework (Arduino, www.arduino.cc, Monza, Italy). The sensor head is equipped with eight identical Hall sensors, arranged in a cuboid shape (\(\Delta x=20\,\text{mm}\), \(\Delta y=10\,\text{mm}\), \(\Delta z=10\,\text{mm}\)), which allows eight measuring points to be measured simultaneously at a single position, thus enabling the mapping of the entire field more quickly.
Fig. 1a Rendering of the Hall sensor head with 8 sensors (H1–H8) distributed along 2 planes. b Schematic structure of the sensor concept, illustrating the sensor head and a microcontroller-based central control unit. c Rendering of the calibration setup, which consists of (i) a double cylinder made of PVC pipe with an inner radius of 151.6 mm, (ii) a two-layer cylindrical coil with 297 layers inside and 27 additional layers at each end, (iii) a positioning slide for the sensor and (iv) the Hall sensor head itself.
Each Hall sensor is independently controlled and measures the magnetic field in three orthogonal directions during a single measuring cycle. The integrated analog-to-digital converter (ADC) has an effective number of bits \(\text{ENOB}=10\,\text{bit}\), with a sensitivity error of \(\pm 0.7\%\) (\(x\)/\(z\)-axes) and \(\pm 0.6\%\) (\(y\)-axis) over the entire range and a temperature dependence of \(0.12\%/\text{K}\). Assuming a proper calibration of the sensor to exclude static errors and static magnetic fields, the resolution can be significantly enhanced through oversampling. In the developed system, oversampling by a factor of \(1000\) was selected, which, in the optimal case of a static, error-free system, improves the ENOB by \(5\,\text{bit}\) [18]. This results in an ADC-related resolution limit of \(3\,\mu\text{T}\). The \(1000\) measured values are read in \(6.5\,\text{s}\) and sent to the computer as an average (a more detailed examination of the averaging and its boundary conditions can be found in the Supplementary Information SI). To achieve the maximum sensing performance, a calibration procedure was implemented.
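The resolution figure above follows directly from the oversampling rule of thumb that each factor of 4 in averaged samples adds one effective bit (ideal, noise-limited case). A minimal sketch of that arithmetic (function name is illustrative, not from the paper):

```python
import math

def oversampled_resolution(full_scale_t, enob, n_samples):
    """ADC-limited resolution [T] after averaging n_samples readings.

    Each factor of 4 in sample count adds one effective bit
    (ideal, noise-limited oversampling)."""
    extra_bits = 0.5 * math.log2(n_samples)
    return full_scale_t / 2 ** (enob + extra_bits)

# +/-50 mT range -> 100 mT full scale, ENOB = 10 bit, 1000-fold oversampling
res = oversampled_resolution(100e-3, 10, 1000)
print(f"{res * 1e6:.1f} uT")  # prints "3.1 uT"
```

With \(N=1000\), the gain is \(0.5\log_2 1000 \approx 5\) bit, reproducing the \(3\,\mu\text{T}\) limit stated in the text.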
Calibration of the Hall sensor head

A stable, spatially homogeneous, and precisely known magnetic field is required as a reference for calibrating the Hall sensors. Static electromagnets can be used, as the generated field can be characterized very accurately by Ampère's law if the geometry and electrical parameters are known precisely. An easy-to-replicate multi-layer solenoid was designed and used to calibrate the sensor head (see SI, chapter 4), whereby the solenoid is much larger than a single Hall sensor. It can, therefore, be assumed that a homogeneous magnetic field is present in the measurement volume of the Hall sensor. The coil is shown in Fig. 1c with the Hall sensor head mounted in the isocenter.
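The on-axis reference field of a finite multi-layer solenoid follows from elementary magnetostatics; the sketch below uses the textbook finite-solenoid expression summed over layers. The paper's exact coil characterization is in its SI, and all geometry values here (layer radii, turn count) are hypothetical placeholders:

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability [T m/A]

def solenoid_b_axis(z, current, length, turns_per_layer, layer_radii):
    """On-axis field B_z [T] of a multi-layer finite solenoid at axial
    position z (z = 0 at the coil center), summing the standard
    finite-solenoid expression over all winding layers."""
    n = turns_per_layer / length  # turns per unit length
    b = 0.0
    for r in layer_radii:
        b += 0.5 * MU0 * n * current * (
            (z + length / 2) / math.hypot(z + length / 2, r)
            - (z - length / 2) / math.hypot(z - length / 2, r)
        )
    return b

# hypothetical example: two layers, 300 mm long, ~76 mm layer radii, 1 A
b0 = solenoid_b_axis(0.0, 1.0, 0.3, 300, [0.0758, 0.0768])  # ~2.24 mT
```

The field scales linearly with the drive current, so a single precise current measurement fixes the calibration reference.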
Robot arm with 5 degrees of freedom

Mechanics

A robot arm with 5 DoF is used, with each rotational joint capable of movement up to \(180^\circ\) via an integrated servo motor. Such robots are also known as manipulators, cobots, or redundant robots [19]. In this paper, the open-source robot TinkerKit Braccio from Arduino was used, for which all CAD files are freely available. A rendering is shown in Fig. 2a. However, any robot arm with a similar structure can be used, e.g., the self-buildable one in Ref. [20]. In this work, the commercial version is used and augmented with custom hardware. The \(220\,\text{mm}\)-long arm (white in Fig. 2a) ensures that the servo motor is positioned sufficiently far away from the sensor head; the required length was estimated from the leakage flux of the motor at the sensor (calculated to be well below Earth's magnetic field).
Fig. 2a Rendering of the (i) robot with 5DoF, (ii) ArUco marker mechanically fixed to the robot, (iii) mounted Hall sensor and (iv) base angle sensor. b Schematic illustration of the coordinate systems of the base (BCS), the robot (RCS) and the FOV (FCS), with the position of the reference board, the robot markers, the joint angles (α0 to α3) and the points P1 to P5 (used in the mathematical description). c Picture of the entire setup with (i) the camera, (ii) CPU of the Hall sensor, (iii) Arduino Uno and power electronics shield, (iv) reference board on the back, (v) reference marker at 60 mm towards the camera, (vi) origin of the BCS and (vii) the low field magnet. d Control and measurement concept.
The motor in the shoulder joint (setting \(\alpha_1\)) is subjected to the greatest load due to the weight of the whole robot arm. Additional springs are mounted (see Fig. 2b) to reduce the torque required from this motor (e.g., a \(29.3\%\) torque reduction near the isocenter).
Two global coordinate systems are used: the robot coordinate system (RCS), which has its origin at the base of the robot and is used to describe the kinematics, and the base coordinate system (BCS), which has its origin at the reference board for motion tracking. Both coordinate systems are rotated relative to each other (see Fig. 2b). The following equation applies to the coordinate transformation of an exemplary matrix \(\mathbf{M}\in\mathbb{R}^{3\times 3}\):
$${}^{\text{RCS}}\mathbf{M} = {}^{\text{RCS}}\mathbf{T}_{\text{BCS}}\;{}^{\text{BCS}}\mathbf{M}\;{}^{\text{BCS}}\mathbf{T}_{\text{RCS}}, \qquad {}^{\text{RCS}}\mathbf{T}_{\text{BCS}} = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}$$
(2.1)
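Since the transformation in Eq. (2.1) is a pure axis permutation (it swaps the \(y\)- and \(z\)-axes) and a permutation of this form is its own inverse, applying it is trivial; a minimal numpy sketch, assuming the matrix shown in Eq. (2.1):

```python
import numpy as np

# BCS -> RCS axis permutation (swaps the y and z axes); T is its own inverse
T = np.array([[1, 0, 0],
              [0, 0, 1],
              [0, 1, 0]], dtype=float)

def bcs_to_rcs(m_bcs):
    """Transform a 3x3 matrix (e.g. a rotation) from the BCS to the RCS."""
    return T @ m_bcs @ T  # T == T^-1 for this permutation

v_bcs = np.array([0.0, 0.2, 0.5])  # a point with y = 0.2, z = 0.5 in the BCS
v_rcs = T @ v_bcs                  # -> y and z swapped in the RCS
```

The same matrix applied from both sides transforms rotation matrices, while a single multiplication suffices for vectors.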
Kinematics

The kinematics of the robot (reduced to a 4DoF problem, with the last axis held constant) is determined by the rotations of the individual axes. The use of homogeneous coordinates in \(\mathbb{R}^4\) simplifies the description of the kinematics and, in particular, the evaluation of the motion tracking, and they are, therefore, used in the following analysis (more details in the Appendix). In the kinematics of the robot, each axis is regarded as a separate origin of an individually rotated coordinate system. The origin describes the center of the axis of rotation and is numbered consecutively, starting with \(\mathbf{P}_1\) for the base, continuing with \(\mathbf{P}_2\) for the next joint, and so on. \(\mathbf{P}_5\) describes the mounting point of the sensor head. Figure 2b illustrates the positions and connections. \(\mathbf{P}_5\) can be determined by the following affine mapping [21]:
$${}^{1}\mathbf{T}_{5}={}^{1}\mathbf{T}_{2}\,{}^{2}\mathbf{T}_{3}\,{}^{3}\mathbf{T}_{4}\,{}^{4}\mathbf{T}_{5}$$
(2.2)
where each affine matrix \({}^{n}\mathbf{T}_{n+1}\) consists of a composition of a rotation \(\mathbf{R}_{n}\) and a translation \(\mathbf{t}_{n}\). Once the joint angles are known, the position of the sensor head can be calculated with relative ease. Inverse kinematics, however, is an underdetermined problem, as four joint angles must be calculated from only three coordinates [21]. Nonlinear least-squares optimization is used to solve the inverse kinematics [19]:
$$\tilde{\boldsymbol{\alpha}} = \arg \mathop{\min}\limits_{\boldsymbol{\alpha}} \left( \left\| \mathbf{P}_{\text{target}} - \mathbf{f}(\boldsymbol{\alpha}) \right\|_{2}^{2} \right)$$
(2.3)
As a boundary condition, the sensor head should be positioned as horizontally as possible within the FOV to avoid collisions with the magnet bore. Constraining this tilt angle reduces the 3D space that can be reached by the robot arm, which may result in a discrepancy between the desired and the reachable coordinate. A camera-based feedback of the reached position is therefore implemented. The maximum permitted tilt angle (relative to the horizon) was empirically determined as \(10^\circ\) for the following experiments. To minimize vibrations of the sensor head, the trajectory through the FOV is planned such that the distance between two adjacent points is minimized. For these robot types, it has been shown that having more DoF than ultimately required causes difficulties when repeating a trajectory (called repeatability or cyclicity) [19]. An additional experiment was conducted to investigate this error by sequentially approaching the corner points of a cube with a side length of \(75\,\text{mm}\) centered at the isocenter and measuring the magnetic field. This experiment was repeated five times in series.
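The forward-kinematics chain of Eq. (2.2) and the least-squares inverse of Eq. (2.3) can be sketched as follows. The link lengths are hypothetical placeholders (the real values follow from the Braccio CAD files), and the damped least-squares solver with a numeric Jacobian is a stand-in for the paper's nonlinear least-squares optimizer:

```python
import numpy as np

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans_z(d):
    t = np.eye(4)
    t[2, 3] = d
    return t

LINKS = [0.072, 0.125, 0.125, 0.220]  # hypothetical link lengths [m]

def forward(alpha):
    """Sensor-head position from the four joint angles (chain of
    homogeneous transforms as in Eq. 2.2): base yaw alpha[0], then
    three pitch joints, each followed by its link translation."""
    t = rot_z(alpha[0]) @ trans_z(LINKS[0])
    for a, l in zip(alpha[1:], LINKS[1:]):
        t = t @ rot_y(a) @ trans_z(l)
    return t[:3, 3]

def inverse(target, iters=500):
    """Damped least-squares solve of Eq. (2.3) with a numeric Jacobian.
    The problem is underdetermined (4 angles, 3 coordinates); the
    damping term regularizes the redundant direction."""
    a = np.zeros(4)
    for _ in range(iters):
        r = target - forward(a)
        if np.linalg.norm(r) < 1e-10:
            break
        jac = np.empty((3, 4))
        for i in range(4):
            d = np.zeros(4)
            d[i] = 1e-6
            jac[:, i] = (forward(a + d) - forward(a)) / 1e-6
        step = np.linalg.solve(jac.T @ jac + 1e-6 * np.eye(4), jac.T @ r)
        a += np.clip(step, -0.2, 0.2)  # limit step size for stability
    return a

alpha = inverse(np.array([0.15, 0.10, 0.20]))
err = np.linalg.norm(forward(alpha) - np.array([0.15, 0.10, 0.20]))
```

Additional constraints such as the \(10^\circ\) tilt limit would enter as penalty terms or bounds in the same optimization.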
Control of the electronics

The angles to be set are given by the control computer, on which the measurement routine is controlled by a MATLAB (The MathWorks Inc., Natick, USA) script. MATLAB was chosen because a feedback control system can be implemented in MATLAB Simulink. A microcontroller board (Arduino Uno) receives the requested angles from the control computer through a USB serial interface and translates them into electrical control signals for the servo motors. A power electronics stage between the microcontroller output and the servo motors optimally drives the motors and ensures electrical isolation. This setup is shown schematically in Fig. 2d. The Arduino is programmed in C++, using a self-developed library for the motor control to enable adjustment of the acceleration of the servo motors. Four of the five servo motors are used to control the position of the sensor head during the measurement. The fifth servo motor enables the sensor head rotation; it is set once at the beginning of the measurement and then held in this position. The robot can be powered by a power supply (5 V/5 A) or a battery (e.g., through a voltage regulator from a 12 V lead-acid battery).
Motion tracking

Setup

The combination of inexpensive servo motors and the simple design of the joints results in some mechanical tolerance due to the weight of the robot arm. Vision-based motion tracking is one possible solution to this problem for a variety of robotic systems [22,23,24] and is known as vision-based robot control in the case of direct feedback control [25]. The exact absolute position is determined using image-recognition-based motion tracking utilizing the free Open Computer Vision (OpenCV) library [26] (V4.5.5) in conjunction with ArUco (Augmented Reality University of Cordoba) markers [27].
The motion tracking is programmed in Python, and the setup consists of three components: the reference board, the markers on the robot, and the camera (shown in Fig. 2c). The reference board comprises seven individual ArUco markers with a size of \(80\times80\,\text{mm}^2\). Six markers are positioned on the back of the setup at known translations and rotations. An additional seventh marker is positioned on a separate support at a distance of \(60\,\text{mm}\) towards the camera to increase the accuracy of the depth perception of the reference board. These seven markers define the origin of the BCS, which is located on the back of the setup. Four individual ArUco markers measuring \(30\times30\,\text{mm}^2\) are attached to the segments of the robot so that the rotation of each joint can be determined by comparing two adjacent markers.
The selection of the camera is a key factor for the quality of the motion tracking. A selection of tested cameras is listed in the Appendix. In this setup, an older digital camera with optical zoom (HX60, Sony, Minato, Tokyo, Japan, f3.5) was chosen. Depending on the available camera, establishing a live connection to the evaluation script may not be possible; therefore, in this work, the video file is analyzed in post-processing. The camera is positioned approximately \(70\,\text{cm}\) away from the reference board and records the movement at \(1920\times1080\,\text{pixel}\) (Full HD) with \(25\,\text{fps}\). The setup needs to be calibrated to correct for the image distortions caused by the camera lens [28]. Calibration is performed using a checkerboard pattern with 13 horizontal and 9 vertical elements. \(300\) individual images are extracted from a recorded video (length \(30\,\text{s}\)), followed by the automated calculation of the calibration factors (the script takes about \(1\,\text{min}\) on a standard office computer). The calculated reprojection error is \(0.0685\,\text{pixel}\).
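The reported reprojection error is the RMS pixel distance between the detected checkerboard corners and the corners reprojected through the fitted camera model; OpenCV's `calibrateCamera` returns this value directly. A minimal numpy sketch of the metric itself (toy corner data, not from the paper):

```python
import numpy as np

def rms_reprojection_error(detected, reprojected):
    """RMS distance [pixel] between detected corner positions and the
    positions reprojected through the calibrated camera model.
    Both arrays have shape (n_corners, 2)."""
    d = np.asarray(detected, float) - np.asarray(reprojected, float)
    return np.sqrt(np.mean(np.sum(d ** 2, axis=1)))

# toy data: every corner is off by (0.06, 0.03) pixel
det = np.array([[10.0, 20.0], [30.0, 40.0]])
rep = det + np.array([0.06, 0.03])
print(round(rms_reprojection_error(det, rep), 4))  # prints 0.0671
```

A sub-0.1-pixel value, as achieved here, indicates that the lens-distortion model fits the checkerboard observations well.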
In addition to the motion tracking, a self-developed sensor based on a sliding potentiometer is installed in the system to detect the y-Euler angle (which depends solely on the base angle \(\alpha_0\)). The slider of the potentiometer is moved via a lever mounted to the rotating robot base. A stabilized voltage is applied across the potentiometer, allowing the position of the lever to be determined via a voltage measurement \(u(x)\). Due to the small angles, a small-angle approximation is employed, resulting in a linear equation (more details in chapter v of the Appendix):
$$\alpha_{0}=k_{1}\cdot u\left(x\right)+ k_{0}$$
(2.4)
with the two parameters \(k_{1}=7.6^\circ/\text{V}\) and \(k_{0}=-11.7^\circ\), which can either be determined in a calibration measurement or calculated if the geometry (e.g., the distance to the rotation axis) is known precisely. To minimize the tolerances of the sensor due to its mechanical play, the lever of the potentiometer was preloaded using springs.
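Eq. (2.4) is a plain two-parameter linear map from voltage to angle; a minimal sketch using the fitted slope \(7.6^\circ/\text{V}\) and offset \(-11.7^\circ\) from the text (constant names are illustrative):

```python
K1 = 7.6    # slope [deg/V], from the calibration in the text
K0 = -11.7  # offset [deg], from the calibration in the text

def base_angle_deg(u_volt):
    """Base joint angle alpha_0 [deg] from the potentiometer voltage,
    valid in the small-angle regime of the lever (Eq. 2.4)."""
    return K1 * u_volt + K0

angle = base_angle_deg(2.0)  # 7.6 * 2.0 - 11.7 = 3.5 deg
```

The linearity holds only because of the small-angle approximation; for large lever deflections the exact trigonometric relation would be needed.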
Position determination

OpenCV provides algorithms that identify ArUco markers within the video file and estimate their position [27]. For each identified marker, the translation and rotation relative to the camera position are calculated in the camera coordinate system (CCS, origin in the center of the camera image plane).
During the mapping, the robot is given a four-second window to move to the new coordinate and settle. Thereafter, the position of all markers is evaluated frame by frame, accumulated over 100 frames, and then averaged. This averaging of positions enhances the resilience of the system against noise or video errors. In each frame, the translation vector and rotation matrix of the BCS origin in the CCS are determined for all detected reference markers (see chapter vi of the Appendix for equations). The markers on the robot are detected frame by frame in the CCS according to the same principle and are arithmetically averaged over 100 frames. The robot markers are then transformed to the BCS, in which the position of the FOV and the RCS are defined by and known from the mechanical setup. To determine the translation and rotation of the sensor head relative to the FOV using the kinematics, it is necessary to determine the actual joint angles. \(\alpha_0\) is known from the additional potentiometer-based sensor, and the angles of the joints \(\alpha_1 \dots \alpha_3\) are calculated from the corresponding adjacent marker pairs.
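The joint-angle extraction from adjacent markers reduces to the angle of the relative rotation between their pose matrices; the frame averaging is a plain arithmetic mean. A sketch of both steps, assuming rotation matrices and translation vectors as delivered by the pose estimator (function names are illustrative):

```python
import numpy as np

def joint_angle_deg(r_a, r_b):
    """Joint angle [deg] from the rotation matrices of two adjacent
    markers (both already expressed in the same coordinate system):
    the angle of the relative rotation R_a^T R_b, via the trace formula."""
    r_rel = r_a.T @ r_b
    c = (np.trace(r_rel) - 1.0) / 2.0
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

def mean_translation(frames):
    """Arithmetic mean of a marker's translation vectors over frames."""
    return np.mean(np.asarray(frames, float), axis=0)

# toy check: two markers differing by a 30 deg rotation about z
a = np.radians(30)
rz = np.array([[np.cos(a), -np.sin(a), 0],
               [np.sin(a),  np.cos(a), 0],
               [0, 0, 1]])
angle = joint_angle_deg(np.eye(3), rz)  # ~30 deg
```

Averaging over 100 frames before extracting angles suppresses per-frame pose jitter, consistent with the procedure described above.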
Reference standard cubic robot using linear drives

To validate the performance of the constructed robotic arm, a standard 3-axis robotic measurement system was replicated according to the design from Han et al. [16]. The robot is shown in Fig. 3 [29]. Due to a different arrangement of the linear drives, the measuring volume could be increased to \(663\times606\times687\,\text{mm}^3\) (XYZ). A commercial Hall sensor (Series 9900 with a 3-axis Hall probe ZOA99-3208, F.W. BELL/Meggitt, Christchurch, UK) was used, for which an absolute accuracy of \(\pm 0.57\%\) (at \(50\,\text{mT}\), approximately \(285\,\mu\text{T}\)) and a relative accuracy of \(0.035\%\) (at \(50\,\text{mT}\), approximately \(17.5\,\mu\text{T}\)) can be achieved. The low update rate of the sensor, together with an averaging of ten measurements, resulted in communication issues between the Hall sensor and the computer, necessitating the termination and restart of the measurement routine. Restarting the Hall sensor initiates a new zeroing, which is reflected in the measured values as a jump. An analysis of the measurement points is, therefore, only reasonably possible after post-processing of the data. For the correction, it is assumed that the same magnetic field strength should be measured at the same point in the FOV in two consecutive measurements. The offset calculated in this way affects the whole data set, which can introduce an additional error.
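The offset correction described above can be sketched as stitching two consecutive sub-measurements at a repeated FOV point: the jump observed at that point is subtracted from the later run. The field values below are toy data, and the single-overlap-point convention is an illustrative simplification:

```python
def correct_offset(run_a, run_b):
    """Stitch two consecutive sub-measurements: the same FOV point was
    measured as the last sample of run_a and the first sample of run_b,
    so their difference is the zeroing jump; subtract it from run_b."""
    jump = run_b[0] - run_a[-1]
    return run_a + [b - jump for b in run_b[1:]]

# toy example: a +0.002 (a.u.) zeroing jump after the sensor restart
a = [50.001, 50.002, 50.003]
b = [50.005, 50.006]            # b[0] repeats the point of a[-1]
stitched = correct_offset(a, b)  # jump removed from the second run
```

Because the whole later run is shifted by one measured jump, any noise in that single overlap measurement propagates into all corrected points, which is the additional error the text mentions.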
Fig. 3 Image of the assembled robot with mounted ArUco markers for precision measurement. It should be noted that a left-handed coordinate system was used in the implementation of this robot. (i) Mounting point of the Hall sensor
The positioning accuracy of the robot was evaluated through motion tracking with OpenCV. Nine ArUco markers (side length \(80\,\text{mm}\)) were affixed to the frame. Two markers (side length \(50\,\text{mm}\)) were mounted on the moving part of the arm. The same camera was used as in the previous section.
Devices under test

Halbach magnet

A Halbach magnet, designed according to O'Reilly et al. [6], was mapped. 536 magnets (NdFeB, \(12\times12\times12\,\text{mm}^3\), \(1.395\,\text{T}\pm 25\,\text{mT}\)) result in a magnet with a length of \(300\,\text{mm}\) and a bore size of \(160\,\text{mm}\). For the purpose of shimming, up to 840 additional magnets (NdFeB, \(5\times5\times5\,\text{mm}^3\), \(1.43\,\text{T}\)) can be positioned on the exterior of the Halbach magnet at a radius of \(166\,\text{mm}\). To compute the additional shimming field, a genetic algorithm according to Refs. [6, 8] was used, considering only the field component along the \(B_0\) main direction [8] (further information in the SI).
Using the presented measurement setup, a FOV of \(75\times75\times75\,\text{mm}^3\) (XYZ) with a step size of \(5\,\text{mm}\) was investigated, resulting in a total of 4096 measured points, which corresponds to 512 movements due to the eight mounted Hall sensors. The entire measurement took \(170\,\text{min}\) and ran essentially fully autonomously. The FOV is scanned in \(yz\)-planes consecutively along the \(x\)-direction. As the \(x\) position increases, the force of the spring of the additional bracing, which is intended to mechanically relieve the shoulder motor, decreases. To enhance the precision of the system, the spring was manually retensioned halfway through the measurement. To simplify the shimming process, the measured volume was interpolated to a perfect \(5\,\text{mm}\) grid.
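Because the reached positions deviate slightly from the ideal grid, the scattered samples must be resampled onto the regular \(5\,\text{mm}\) lattice. The paper does not specify the interpolation kernel; the sketch below uses inverse-distance weighting over the nearest neighbors as a simplified stand-in:

```python
import numpy as np

def idw_interpolate(points, values, query, k=8, p=2):
    """Inverse-distance-weighted resampling of scattered field samples
    at a single query point, using the k nearest samples. A simplified
    stand-in; the paper's exact interpolation method is not specified."""
    points = np.asarray(points, float)
    values = np.asarray(values, float)
    d = np.linalg.norm(points - query, axis=1)
    idx = np.argsort(d)[:k]
    if d[idx[0]] < 1e-12:        # query coincides with a sample
        return float(values[idx[0]])
    w = 1.0 / d[idx] ** p
    return float(np.dot(w, values[idx]) / w.sum())

# toy example: samples on a jittered line [mm], resampled at x = 2.5
pts = np.array([[0.9, 0, 0], [2.1, 0, 0], [3.0, 0, 0]])
vals = [50.0, 50.2, 50.3]
b = idw_interpolate(pts, vals, np.array([2.5, 0, 0]), k=2)
```

Running this over every node of the perfect grid yields the regularized field map used for shimming.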
For completeness, a shimming run was also performed using only the Hall sensor H7, because the eight individual Hall sensors still differ slightly in their calibrated sensitivity due to measurement tolerances during calibration. Using only one sensor for the shimming process can thus be considered a cross-check. H7 was chosen because this position provides the largest number of points around the center of the FOV, owing to the asymmetric distribution of the measurement points caused by the kinematics.
The same Halbach magnet was measured on a different day using the reference cubic robot and the commercial F.W. BELL Hall sensor, with the possibility of a small temperature-related deviation of \(B_0\). A FOV of \(100\times100\times100\,\text{mm}^3\) (XYZ) with a step size of \(5\,\text{mm}\) was measured with a total of 9298 measuring points, which took approximately \(38\,\text{h}\) and required splitting the overall measurement into 11 sub-measurements.
X-gradient

An \(x\)-gradient, designed for a spherical FOV of \(40\,\text{mm}\) using Ref. [30], was mapped with both measuring systems, with the current switched on during the magnetic field measurement of each point (further information in the SI). The current was measured via a shunt (\(0.1\%\) tolerance, \(\pm 1\,\text{ppm/K}\), RUG-Z-R020-0.1-TK1, Isabellenhütte Heusler, Dillenburg, Germany) using a Kelvin measurement and a multimeter (GDM-8255A, GW Instek, Taipei City, Taiwan).
Using the presented measurement system, a FOV of \(100\times30\times40\,\text{mm}^3\) (XYZ) was mapped with a step size of \(10\,\text{mm}\), resulting in a total of 288 points (36 movements due to the eightfold sensor head). The measurements were performed with a mean current of \(20.403\,\text{A}\). Due to ohmic losses, a convective cooling pause of about \(30\,\text{s}\) was inserted every 6 movements. The total measurement time was \(12\,\text{min}\). The same measurement with the cubic robot took \(35\,\text{min}\) for 190 points in a spherical FOV with a diameter of \(60\,\text{mm}\) at a current of \(14.5\,\text{A}\) with identical cooling pauses.
As a reference measurement, the gradient was characterized using a clinical 3 T MRI scanner (Prisma, Siemens Healthineers, Erlangen, Germany). For transmit, the body coil was used, while a Flex coil was wrapped around the gradient for receive. The gradient was loaded with a water phantom (0.9 l glass bottle) and powered with a current of \(0.4\,\text{A}\). To obtain \(B_0\) field maps, a 3D gradient-echo imaging sequence was used with echo times \(T_E=1.9\,|\,5.16\,|\,8.42\,\text{ms}\) and \(T_R=30\,\text{ms}\). A flip angle of \(20^\circ\) and an overall isotropic resolution of \(2\,\text{mm}\) were set. The homogeneity \(\Delta B_0\) was calculated from the phase evolution after the recorded phase was unwrapped [31]. By subtracting a baseline field map, the effect of the turned-on gradient on \(B_0\) itself can be determined.
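The field map follows from the multi-echo phase evolution via the standard relation \(\Delta\varphi = \gamma\,\Delta B_0\,\Delta T_E\); a minimal per-voxel sketch (toy phase values, assuming the phase has already been unwrapped as in the text):

```python
import math

GAMMA = 2 * math.pi * 42.577e6  # proton gyromagnetic ratio [rad/s/T]

def delta_b0(phi_1, phi_2, te_1, te_2):
    """Field offset Delta B0 [T] from the unwrapped phase [rad] of two
    echoes: delta_phi = GAMMA * delta_B0 * delta_TE."""
    return (phi_2 - phi_1) / (GAMMA * (te_2 - te_1))

# example: 1 rad of phase accrual between TE = 1.9 ms and 5.16 ms
db0 = delta_b0(0.0, 1.0, 1.9e-3, 5.16e-3)  # ~1.15 uT
```

Subtracting a baseline map computed the same way with the gradient off isolates the gradient's own contribution, as described above.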