Virtual Chinrest Task


In this task, participants determine their viewing distance from the screen using a blind spot detection method. A red circle moves laterally across the screen while participants fixate on a black square. When the circle enters their blind spot and disappears from view, participants press the spacebar. Based on the distance between the fixation point and the blind spot location, the task estimates viewing distance.

The blind spot is a natural feature of human vision located approximately 13-15 degrees temporal to the fovea, where the optic nerve exits the retina. Because this anatomical landmark is roughly constant across people, the virtual chinrest provides a convenient way to calibrate viewing distance in online experiments without specialized equipment [1]. This allows researchers to present stimuli in calibrated physical units (e.g., degrees of visual angle) even in remote testing scenarios.

Quick Start

  1. Create or open an experiment from your Dashboard
  2. Click Add task and select "Virtual Chinrest"
  3. Optionally adjust the number of trials for more precise measurement
  4. Preview to ensure the task runs smoothly
  5. Use the measured viewing distance to calibrate subsequent tasks

New to Meadows? See the Getting Started guide for a complete walkthrough.

Parameters

Customize the task by changing these on the Parameters tab of the task.

General Interface settings

Customize the instruction at the top of the page, as well as toolbar buttons. These apply to most task types on Meadows.

Instruction hint

Text that you can display during the task at the top of the page.

Extended instruction

A longer instruction that only appears if the participant hovers their mouse cursor over the hint.

Hint size

Whether to display or hide the instruction, and what font size to use.

Fullscreen button

Whether to display a button in the bottom toolbar that participants can use to switch fullscreen mode on and off.

Calibration Settings

Trials (Number of trials)

Number of measurement trials to perform. More trials provide a more accurate estimate by averaging multiple measurements. Default: 5. Valid range: 1 to 100.

Object size (pixels)

Size of both the fixation square and the moving target circle in pixels. Larger objects may be easier to track. Default: 30. Valid range: 1 to 200.

Blindspot Angle (degrees)

The assumed angular location of the blind spot relative to the fixation point, in degrees of visual angle. The typical human blind spot is located approximately 13-15 degrees temporal to the fovea. Default: 13.5. Valid range: 1 to 50.

Debug mode

When enabled, prints detailed information to the browser console (accessible via right-click → Inspect). Useful for troubleshooting but should be turned off for data collection with external participants. Default: unchecked.

How it works

The task uses the following procedure:

  1. Participants fixate on a black square positioned on the right side of their visual field
  2. A red circle starts moving from left to right
  3. As the circle approaches the participant's blind spot, it disappears from view
  4. Participants press the spacebar when they notice the circle has disappeared
  5. The task records the position where the response occurred
  6. This process repeats for the specified number of trials
  7. The viewing distance is calculated using trigonometry based on the average measured blind spot location

The calculation uses the formula: viewing_distance = blind_spot_distance / tan(blind_spot_angle), where the blind spot distance is measured on screen and converted from pixels to millimeters using the card width calibration (85.6 mm standard credit card width).
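The calculation above can be sketched in a few lines of Python. All numeric values below are hypothetical examples, not output of a real session:

```python
import math

card_width_mm = 85.6         # standard ID-1 credit card width used for calibration
card_width_px = 323.0        # hypothetical on-screen width of the card image
blind_spot_dist_px = 512.0   # hypothetical mean fixation-to-blind-spot distance
blind_spot_angle_deg = 13.5  # task default

# Pixels-to-millimeters scale from the card calibration
px_per_mm = card_width_px / card_width_mm

# Convert the measured blind spot distance from pixels to millimeters
blind_spot_dist_mm = blind_spot_dist_px / px_per_mm

# viewing_distance = blind_spot_distance / tan(blind_spot_angle)
viewing_distance_mm = blind_spot_dist_mm / math.tan(math.radians(blind_spot_angle_deg))
print(f"Estimated viewing distance: {viewing_distance_mm:.0f} mm")
```

With these example numbers the estimate comes out around 565 mm, i.e. a typical laptop viewing distance.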

Note

This task does not currently handle window resizing during execution. If you need this functionality, please contact the Meadows team.

Data

For general information about the various structures and file formats that you can download for your data see Downloads.

The annotations download contains trial-wise table rows: one row per measurement trial, plus a final row with the estimated viewing distance. Columns:

  • trial - numerical index of the measurement trial (0-indexed)
  • time_trial_start - timestamp when the moving circle animation began (Unix time: seconds since 1970-01-01 UTC)
  • time_trial_response - timestamp when the participant pressed spacebar (Unix time: seconds since 1970-01-01 UTC)
  • label - encoded measurement information:
    • For measurement trials: targetDistPx_{distance} where distance is the measured blind spot distance in pixels
    • For the final trial: viewDistMm_{distance} where distance is the estimated viewing distance in millimeters
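Because the timestamps are Unix epoch seconds, per-trial response latencies can be computed by simple subtraction. A minimal sketch with made-up rows mimicking the format described above (the epoch values and labels are hypothetical):

```python
import pandas as pd

# Two hypothetical annotation rows in the documented format
df = pd.DataFrame({
    'trial': [0, 1],
    'time_trial_start': [1700000000.0, 1700000010.0],
    'time_trial_response': [1700000002.5, 1700000013.2],
    'label': ['targetDistPx_510.0', 'targetDistPx_514.0'],
})

# Convert epoch seconds to readable datetimes
df['trial_start'] = pd.to_datetime(df['time_trial_start'], unit='s')

# Seconds from animation onset to the spacebar press
df['response_latency_s'] = df['time_trial_response'] - df['time_trial_start']
print(df[['trial', 'trial_start', 'response_latency_s']])
```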

Analysis

Extract Viewing Distance

The primary output of this task is the estimated viewing distance in millimeters, which can be used to calibrate subsequent tasks in your experiment.

Python:

import pandas as pd
import numpy as np

# Load the annotations data
df = pd.read_csv('Meadows_myExperiment_v1_annotations.csv')

# Extract viewing distance from the final trial
final_trial = df[df['label'].str.startswith('viewDistMm_')]
if not final_trial.empty:
    viewing_distance = float(final_trial.iloc[0]['label'].split('_')[1])
    print(f"Estimated viewing distance: {viewing_distance:.1f} mm ({viewing_distance/10:.1f} cm)")

# Extract individual trial measurements
trial_data = df[df['label'].str.startswith('targetDistPx_')].copy()  # .copy() avoids SettingWithCopyWarning
if not trial_data.empty:
    trial_data['blind_spot_distance_px'] = trial_data['label'].str.split('_').str[1].astype(float)
    print("\nMeasurement consistency:")
    print(f"Mean: {trial_data['blind_spot_distance_px'].mean():.2f} px")
    print(f"Std Dev: {trial_data['blind_spot_distance_px'].std():.2f} px")
    print(f"Range: {trial_data['blind_spot_distance_px'].min():.2f} - {trial_data['blind_spot_distance_px'].max():.2f} px")
R:

library(tidyverse)

# Load the annotations data
df <- read_csv('Meadows_myExperiment_v1_annotations.csv')

# Extract viewing distance from the final trial
viewing_distance_row <- df %>%
  filter(str_starts(label, 'viewDistMm_')) %>%
  slice(1)

if (nrow(viewing_distance_row) > 0) {
  viewing_distance <- as.numeric(str_split(viewing_distance_row$label, '_')[[1]][2])
  cat(sprintf("Estimated viewing distance: %.1f mm (%.1f cm)\n", 
              viewing_distance, viewing_distance/10))
}

# Extract individual trial measurements
trial_data <- df %>%
  filter(str_starts(label, 'targetDistPx_')) %>%
  mutate(
    blind_spot_distance_px = as.numeric(str_split(label, '_', simplify = TRUE)[, 2])
  )

if (nrow(trial_data) > 0) {
  cat("\nMeasurement consistency:\n")
  trial_data %>%
    summarise(
      mean = mean(blind_spot_distance_px),
      sd = sd(blind_spot_distance_px),
      min = min(blind_spot_distance_px),
      max = max(blind_spot_distance_px)
    ) %>%
    print()
}

To extract viewing distance in Excel or Google Sheets:

  1. Open the annotations.csv file
  2. Filter the label column for rows starting with viewDistMm_
  3. The viewing distance in millimeters is the number after the underscore
  4. To extract it automatically, use: =VALUE(RIGHT(A2, LEN(A2)-FIND("_",A2))) (assuming the label is in cell A2)
  5. Convert to centimeters by dividing by 10

To analyze measurement consistency:

  1. Filter for rows starting with targetDistPx_
  2. Extract the pixel distances using the same formula approach
  3. Calculate mean and standard deviation to assess measurement reliability

Use Viewing Distance for Visual Angle Calculations

Once you have the viewing distance, you can calculate the appropriate stimulus size in pixels for a desired visual angle, or convert measured pixel sizes to visual angles.

Python:

import numpy as np

# Viewing distance in mm (from previous analysis)
viewing_distance_mm = 600  # example value

# Screen resolution (replace with actual values)
pixels_per_mm = 3.78  # from card measure or screen specs

def visual_angle_to_pixels(angle_deg, distance_mm, px_per_mm):
    """Convert visual angle in degrees to pixels"""
    size_mm = 2 * distance_mm * np.tan(np.radians(angle_deg / 2))
    return size_mm * px_per_mm

def pixels_to_visual_angle(pixels, distance_mm, px_per_mm):
    """Convert pixels to visual angle in degrees"""
    size_mm = pixels / px_per_mm
    angle_rad = 2 * np.arctan(size_mm / (2 * distance_mm))
    return np.degrees(angle_rad)

# Example: stimulus should be 2 degrees of visual angle
desired_angle = 2.0
stimulus_size_px = visual_angle_to_pixels(desired_angle, viewing_distance_mm, pixels_per_mm)
print(f"For {desired_angle}° visual angle at {viewing_distance_mm}mm: {stimulus_size_px:.1f} pixels")
R:

# Viewing distance in mm (from previous analysis)
viewing_distance_mm <- 600  # example value

# Screen resolution (replace with actual values)
pixels_per_mm <- 3.78  # from card measure or screen specs

# Convert visual angle in degrees to pixels
visual_angle_to_pixels <- function(angle_deg, distance_mm, px_per_mm) {
  size_mm <- 2 * distance_mm * tan(angle_deg * pi / 360)
  return(size_mm * px_per_mm)
}

# Convert pixels to visual angle in degrees
pixels_to_visual_angle <- function(pixels, distance_mm, px_per_mm) {
  size_mm <- pixels / px_per_mm
  angle_rad <- 2 * atan(size_mm / (2 * distance_mm))
  return(angle_rad * 180 / pi)
}

# Example: stimulus should be 2 degrees of visual angle
desired_angle <- 2.0
stimulus_size_px <- visual_angle_to_pixels(desired_angle, viewing_distance_mm, pixels_per_mm)
cat(sprintf("For %.1f° visual angle at %dmm: %.1f pixels\n", 
            desired_angle, viewing_distance_mm, stimulus_size_px))
MATLAB:

% Viewing distance in mm (from previous analysis)
viewing_distance_mm = 600;  % example value

% Screen resolution (replace with actual values)
pixels_per_mm = 3.78;  % from card measure or screen specs

% Example: stimulus should be 2 degrees of visual angle
desired_angle = 2.0;
stimulus_size_px = visual_angle_to_pixels(desired_angle, viewing_distance_mm, pixels_per_mm);
fprintf('For %.1f° visual angle at %dmm: %.1f pixels\n', ...
        desired_angle, viewing_distance_mm, stimulus_size_px);

% Local functions must appear after all other code in a MATLAB script

% Convert visual angle in degrees to pixels
function pixels = visual_angle_to_pixels(angle_deg, distance_mm, px_per_mm)
    size_mm = 2 * distance_mm * tand(angle_deg / 2);
    pixels = size_mm * px_per_mm;
end

% Convert pixels to visual angle in degrees
function angle_deg = pixels_to_visual_angle(pixels, distance_mm, px_per_mm)
    size_mm = pixels / px_per_mm;
    angle_deg = 2 * atand(size_mm / (2 * distance_mm));
end

References


  1. Li, Q., Joo, S. J., Yeatman, J. D., & Reinecke, K. (2020). Controlling for Participants' Viewing Distance in Large-Scale, Psychophysical Online Experiments Using a Virtual Chinrest. Scientific Reports, 10(1), 904. doi:10.1038/s41598-020-57853-w