Annotation

starfish.annotation.generate_keypoints(obj, num, stop=1, oversample=10, seed=0)

Generates evenly spaced 3D keypoints on the surface of an object.
This function implements the Sample Elimination algorithm from this paper to generate points on the surface of the object that follow a Poisson Disk distribution. The Poisson Disk distribution guarantees that no two points are within a certain distance of each other in 3D space, ensuring that the keypoints are more spread out.
Sample Elimination works by first generating num * oversample points at random, and then eliminating points in a certain order until num are left. Thus, a higher value of oversample will give more evenly spaced points. This also has the nice property that every intermediate set of points follows a Poisson Disk distribution. By default, this function will keep running sample elimination until there is 1 point left, and then return the points in reverse order of elimination, so that the first n points are also evenly spaced out for any 1 <= n <= num. The point at which Sample Elimination stops can be controlled with the stop parameter.
- Parameters
obj – (BlendDataObject): Blender object to operate on
num – (int): number of points to generate
stop – (int): an integer between 1 and num (inclusive) at which sample elimination will stop, default 1
oversample – (float): amount of oversampling to do (see above), default 10
seed – (int): seed for the initial random point generation
- Returns
A list of length num containing 3-tuples representing the coordinates of the keypoints in object space. The first n elements of the list will also be evenly spaced out for any stop <= n <= num.
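The elimination-and-reverse ordering described above can be sketched in pure numpy. This is a simplified greedy variant for illustration only, not starfish's actual implementation (which follows the weighted scheme from the referenced paper and operates on a mesh surface rather than arbitrary points):

```python
import numpy as np

def sample_elimination(points, num, stop=1):
    """Simplified greedy sample elimination (illustrative, not starfish's code).

    Repeatedly removes the most 'crowded' point (smallest nearest-neighbor
    distance) until only `stop` points remain, then returns the survivors
    followed by the eliminated points in reverse elimination order, truncated
    to `num`. Any prefix of the result is therefore relatively well spread out.
    """
    pts = [tuple(p) for p in points]
    eliminated = []
    while len(pts) > stop:
        arr = np.asarray(pts)
        # pairwise distances; ignore self-distance on the diagonal
        d = np.linalg.norm(arr[:, None] - arr[None, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        crowded = int(np.argmin(d.min(axis=1)))  # point with the closest neighbor
        eliminated.append(pts.pop(crowded))
    ordered = pts + eliminated[::-1]
    return ordered[:num]

rng = np.random.default_rng(0)
candidates = rng.random((50, 3))   # plays the role of num * oversample random points
keypoints = sample_elimination(candidates, num=5)
```

Because points are removed most-crowded-first, reversing the elimination order puts the best-separated points at the front of the returned list.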
starfish.annotation.project_keypoints_onto_image(keypoints, scene, obj, camera)

Converts 3D keypoints of an object into their corresponding 2D coordinates on the image.

This function takes a list of keypoints represented as 3D coordinates in object space, and then projects them onto the camera to get their corresponding 2D coordinates on the image. It uses the current location and orientation of the input Blender objects. Typical usage would be to call this function after Frame.setup and then store the 2D locations as metadata for that frame:

    frame.setup(scene, obj, camera, sun)
    frame.keypoints = project_keypoints_onto_image(keypoints, scene, obj, camera)
    with open('meta...', 'w') as f:
        f.write(frame.dumps())
- Parameters
keypoints – a list of 3D coordinates corresponding to the locations of the keypoints in object space, e.g. the output of generate_keypoints
scene – (BlendDataObject): the scene to use for aspect ratio calculations. Note that this should be the scene that you intend to perform the final render in, not necessarily the one that your objects exist in. If you render in a scene that has an output resolution with a different aspect ratio than the output resolution of this scene, then the results may be incorrect.
obj – (BlendDataObject): the object to use
camera – (BlendDataObject): the camera to use
- Returns
a list of (y, x) coordinates in the same order as keypoints, where (0, 0) is the top left corner of the image and (1, 1) is the bottom right
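starfish performs this projection with Blender's own camera and scene data. As a rough illustration of the underlying math and of the (y, x) convention above, here is a toy pinhole projection; the function name, focal length, sensor width, and resolution values are all assumptions for the sketch, not part of starfish's API:

```python
import numpy as np

def project_points(points_cam, focal=50.0, sensor_width=36.0, resolution=(1920, 1080)):
    """Toy pinhole projection (illustrative, NOT starfish's Blender-based code).

    Maps 3D points in camera space (camera looking down -Z, as in Blender)
    to (y, x) coordinates normalized so that (0, 0) is the top-left corner
    of the image and (1, 1) the bottom-right.
    """
    res_x, res_y = resolution
    aspect = res_y / res_x
    pts = np.asarray(points_cam, dtype=float)
    # perspective divide: the camera looks along -Z
    x = pts[:, 0] / -pts[:, 2]
    y = pts[:, 1] / -pts[:, 2]
    # scale by focal length relative to sensor size
    x_ndc = x * focal / sensor_width
    y_ndc = y * focal / (sensor_width * aspect)
    # shift to [0, 1] with (0, 0) at the top left (image y grows downward)
    return [(0.5 - yi, xi + 0.5) for xi, yi in zip(x_ndc, y_ndc)]

coords = project_points([(0.0, 0.0, -10.0)])  # a point on the optical axis
```

A point on the optical axis lands at the image center, (0.5, 0.5), under this convention.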
starfish.annotation.normalize_mask_colors(mask, colors, color_variation_cutoff=6)

Normalizes the colors of a mask image.
Blender has a bug where the colors in a mask image vary slightly (e.g. instead of the background being solid rgb(0, 0, 0) black, it will actually be a random mix of rgb(0, 0, 1), rgb(1, 1, 0), etc.). This function takes a mask as well as a list of what the colors are supposed to be, then eliminates this variation.
This function accepts either the path to the mask (str) or the mask itself represented as a numpy array. If a path is provided, then the function will return the normalized mask as well as overwrite the original mask on disk. If a numpy array is provided, then the function will just return the normalized mask.
- Parameters
mask – path to mask image (str) or numpy array of mask image (RGB)
colors – a list of what the label colors are supposed to be, each in [R, G, B] format
color_variation_cutoff – colors will be allowed to differ from a color in the label map by a cityblock distance of no more than this value. The default value is 6, or equivalently 2 in each RGB channel. I chose this value because, in my experience with Blender 2.8, the color variation is no more than 1 in each channel, a number I then doubled to be safe.
- Returns
the normalized mask as a numpy array
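The normalization step can be sketched in numpy: snap every pixel to a label color whenever it lies within the cityblock (L1) cutoff. This is an illustration of the idea, not starfish's implementation, and it only covers the numpy-array input path:

```python
import numpy as np

def normalize_mask(mask, colors, cutoff=6):
    """Sketch of mask color normalization (illustrative, not starfish's code).

    Snaps each pixel to any label color within a cityblock (L1) distance of
    `cutoff`; pixels matching no label color are left unchanged.
    """
    mask = np.asarray(mask, dtype=np.int32)
    out = mask.copy()
    for color in colors:
        # per-pixel L1 distance to this label color
        dist = np.abs(mask - np.array(color)).sum(axis=-1)
        out[dist <= cutoff] = color
    return out.astype(np.uint8)

noisy = np.array([[[0, 0, 1], [205, 1, 0]]], dtype=np.uint8)  # slightly-off colors
clean = normalize_mask(noisy, colors=[[0, 0, 0], [206, 0, 0]])
```

With the default cutoff of 6, rgb(0, 0, 1) snaps to black and rgb(205, 1, 0) snaps to rgb(206, 0, 0).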
starfish.annotation.get_bounding_boxes_from_mask(mask, label_map)

Gets bounding boxes from instance masks.
- Parameters
mask – path to mask image (str) or numpy array of mask image (RGB)
label_map – dictionary mapping classes (str) to their corresponding color(s). Each class can correspond to a single color (e.g. {"cygnus": (0, 0, 206)}) or multiple colors (e.g. {"cygnus": [(0, 0, 206), (206, 0, 0)]})
- Returns
a dictionary mapping classes (str) to their corresponding bboxes (a dictionary with the keys 'xmin', 'xmax', 'ymin', 'ymax'). If a class does not appear in the image, then it will not appear in the keys of the returned dictionary.
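The bounding-box extraction can be sketched in numpy as follows. This is an illustration of the return format described above (pixel-coordinate extents per class, absent classes omitted), not starfish's implementation, and it only covers the numpy-array input path:

```python
import numpy as np

def bboxes_from_mask(mask, label_map):
    """Sketch of bounding-box extraction (illustrative, not starfish's code)."""
    mask = np.asarray(mask)
    boxes = {}
    for cls, colors in label_map.items():
        if isinstance(colors[0], int):   # single color -> wrap in a list
            colors = [colors]
        hit = np.zeros(mask.shape[:2], dtype=bool)
        for color in colors:
            # pixels exactly matching this color
            hit |= (mask == np.array(color)).all(axis=-1)
        ys, xs = np.nonzero(hit)
        if len(xs):                      # skip classes absent from the image
            boxes[cls] = {'xmin': int(xs.min()), 'xmax': int(xs.max()),
                          'ymin': int(ys.min()), 'ymax': int(ys.max())}
    return boxes

mask = np.zeros((4, 4, 3), dtype=np.uint8)
mask[1:3, 2] = (0, 0, 206)               # paint a 2x1 region
boxes = bboxes_from_mask(mask, {"cygnus": (0, 0, 206), "empty": (9, 9, 9)})
```

Note that this assumes the mask has already been normalized (see normalize_mask_colors), since it matches colors exactly.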
starfish.annotation.get_centroids_from_mask(mask, label_map)

Gets centroids from instance masks.
- Parameters
mask – path to mask image (str) or numpy array of mask image (RGB)
label_map – dictionary mapping classes (str) to their corresponding color(s). Each class can correspond to a single color (e.g. {"cygnus": (0, 0, 206)}) or multiple colors (e.g. {"cygnus": [(0, 0, 206), (206, 0, 0)]})
- Returns
a dictionary mapping classes (str) to their corresponding centroids (y, x). If a class does not appear in the image, then it will not appear in the keys of the returned dictionary.
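A centroid here is the mean pixel coordinate of a class's pixels, in the same (y, x) order used elsewhere in this module. A numpy sketch of the idea (illustrative, not starfish's implementation, numpy-array input only):

```python
import numpy as np

def centroids_from_mask(mask, label_map):
    """Sketch of centroid extraction (illustrative, not starfish's code)."""
    mask = np.asarray(mask)
    centroids = {}
    for cls, colors in label_map.items():
        if isinstance(colors[0], int):   # single color -> wrap in a list
            colors = [colors]
        hit = np.zeros(mask.shape[:2], dtype=bool)
        for color in colors:
            hit |= (mask == np.array(color)).all(axis=-1)
        ys, xs = np.nonzero(hit)
        if len(ys):                      # skip classes absent from the image
            centroids[cls] = (float(ys.mean()), float(xs.mean()))
    return centroids

mask = np.zeros((4, 4, 3), dtype=np.uint8)
mask[1:3, 2] = (0, 0, 206)               # paint a 2x1 region
cents = centroids_from_mask(mask, {"cygnus": (0, 0, 206)})
```

As with bounding boxes, exact color matching assumes the mask has been normalized first.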