https://github.com/fwilliams/point-cloud-utils

An easy-to-use Python library for processing and manipulating 3D point clouds and meshes.


Science Score: 46.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
    Links to: arxiv.org
  • Committers with academic emails
    1 of 14 committers (7.1%) from academic institutions
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (11.7%) to scientific vocabulary

Keywords

chamfer-distance geometry hausdorff hausdorff-distance hausdorff-measure lloyd-relaxation mesh nanoflann nearest-neighbor nearest-neighbors numpy optimal-transport point-cloud poisson poisson-disc-sampling poisson-disk-sampling python sampling sinkhorn sinkhorn-distance

Keywords from Contributors

transformers interactive optimism sequencers embedded projects scheduling hacking multi-modality chart
Last synced: 6 months ago

Repository

An easy-to-use Python library for processing and manipulating 3D point clouds and meshes.

Basic Info
Statistics
  • Stars: 1,432
  • Watchers: 19
  • Forks: 114
  • Open Issues: 33
  • Releases: 78
Topics
chamfer-distance geometry hausdorff hausdorff-distance hausdorff-measure lloyd-relaxation mesh nanoflann nearest-neighbor nearest-neighbors numpy optimal-transport point-cloud poisson poisson-disc-sampling poisson-disk-sampling python sampling sinkhorn sinkhorn-distance
Created over 7 years ago · Last pushed 10 months ago
Metadata Files
Readme Funding License

README.md


Point Cloud Utils is an easy-to-use Python library for processing and manipulating 3D point clouds and meshes.

Documentation



Author: Francis Williams

If Point Cloud Utils contributes to an academic publication, cite it as:

```
@misc{point-cloud-utils,
  title = {Point Cloud Utils},
  author = {Francis Williams},
  note = {https://www.github.com/fwilliams/point-cloud-utils},
  year = {2022}
}
```

Point Cloud Utils (pcu) is a utility library providing the following functionality for processing 3D point clouds and triangle meshes. See the Examples section for documentation on how to use these:

- Utility functions for reading and writing many common mesh formats (PLY, STL, OFF, OBJ, 3DS, VRML 2.0, X3D, COLLADA). If it can be imported into MeshLab, we can read it!
- A series of algorithms for generating point samples on meshes:
  - Poisson-disk sampling of a mesh, based on "Parallel Poisson Disk Sampling with Spectrum Analysis on Surface".
  - Sampling a mesh with Lloyd's algorithm.
  - Monte-Carlo sampling on a mesh.
- Utilities for downsampling point clouds:
  - To satisfy a blue noise distribution
  - On a voxel grid
- Closest points between a point cloud and a mesh
- Normal estimation from point clouds and triangle meshes
- Fast k-nearest-neighbor search between point clouds (based on nanoflann)
- Hausdorff distances between point clouds
- Chamfer distances between point clouds
- Approximate Wasserstein distances between point clouds using the Sinkhorn method
- Compute signed distances between a point cloud and a mesh using fast winding numbers
- Compute closest points on a mesh to a point cloud
- Deduplicating point clouds and mesh vertices
- Fast ray/mesh intersection using embree
- Fast ray/surfel intersection using embree
- Mesh smoothing
- Mesh connected components
- Mesh decimation
- Removing duplicate/unreferenced vertices in point clouds and meshes
- Making a mesh watertight (based on the Watertight Manifold algorithm)

Installation

pip install point-cloud-utils

Examples

List of examples

Loading meshes and point clouds

Point-Cloud-Utils supports reading many common mesh formats (PLY, STL, OFF, OBJ, 3DS, VRML 2.0, X3D, COLLADA). If it can be imported into MeshLab, we can read it! The type of file is inferred from its file extension.

If you only need a few attributes of a point cloud or mesh, the quickest way to load a mesh is using one of the `load_mesh_*` utility functions:

```python
import point_cloud_utils as pcu

# Load vertices and faces for a mesh
v, f = pcu.load_mesh_vf("path/to/mesh")

# Load vertices and per-vertex normals
v, n = pcu.load_mesh_vn("path/to/mesh")

# Load vertices, per-vertex normals, and per-vertex colors
v, n, c = pcu.load_mesh_vnc("path/to/mesh")

# Load vertices, faces, and per-vertex normals
v, f, n = pcu.load_mesh_vfn("path/to/mesh")

# Load vertices, faces, per-vertex normals, and per-vertex colors
v, f, n, c = pcu.load_mesh_vfnc("path/to/mesh")
```

For meshes and point clouds with more complex attributes, use load_triangle_mesh which returns a TriangleMesh object.

```python
import point_cloud_utils as pcu

# mesh is a lightweight TriangleMesh container object holding mesh vertices, faces,
# and their attributes. Any attributes which aren't loaded (because they aren't
# present in the file) are set to None. The data in TriangleMesh is laid out as
# follows (run help(pcu.TriangleMesh) for more details):
#
# TriangleMesh:
#   vertex_data:
#       positions: [V, 3]-shaped numpy array of per-vertex positions
#       normals: [V, 3]-shaped numpy array of per-vertex normals (or None)
#       texcoords: [V, 2]-shaped numpy array of per-vertex uv coordinates (or None)
#       tex_ids: [V,]-shaped numpy array of integer indices into TriangleMesh.textures
#                indicating which texture to use at this vertex (or None)
#       colors: [V, 4]-shaped numpy array of per-vertex RGBA colors in [0.0, 1.0] (or None)
#       radius: [V,]-shaped numpy array of per-vertex curvature radii (or None)
#       quality: [V,]-shaped numpy array of per-vertex quality measures (or None)
#       flags: [V,]-shaped numpy array of 32-bit integer flags per vertex (or None)
#   face_data:
#       vertex_ids: [F, 3]-shaped numpy array of integer face indices into
#                   TriangleMesh.vertex_data.positions
#       normals: [F, 3]-shaped numpy array of per-face normals (or None)
#       colors: [F, 4]-shaped numpy array of per-face RGBA colors in [0.0, 1.0] (or None)
#       quality: [F,]-shaped numpy array of per-face quality measures (or None)
#       flags: [F,]-shaped numpy array of 32-bit integer flags per face (or None)
#       wedge_colors: [F, 3, 4]-shaped numpy array of per-wedge RGBA colors in [0.0, 1.0] (or None)
#       wedge_normals: [F, 3, 3]-shaped numpy array of per-wedge normals (or None)
#       wedge_texcoords: [F, 3, 2]-shaped numpy array of per-wedge uv coordinates (or None)
#       wedge_tex_ids: [F, 3]-shaped numpy array of integer indices into TriangleMesh.textures
#                      indicating which texture to use at this wedge (or None)
#   textures: A list of paths to texture image files for this mesh
#   normal_maps: A list of paths to normal map image files for this mesh
mesh = pcu.load_triangle_mesh("path/to/mesh")

# You can also load a mesh directly using the TriangleMesh class
mesh = pcu.TriangleMesh("path/to/mesh")
```

For meshes and point clouds with more complex attributes, use `save_triangle_mesh`, which accepts a whole host of named arguments controlling which attributes to save:

```python
import point_cloud_utils as pcu

# save_triangle_mesh accepts a path to save to (the type of mesh saved is determined
# by the file extension), an array of mesh vertices of shape [V, 3], and optional
# named arguments specifying faces, per-vertex attributes, per-face attributes,
# and per-wedge attributes:
#   filename : Path to the mesh to save. The type of file will be determined from the file extension.
#   v : [V, 3]-shaped numpy array of per-vertex positions
#   f : [F, 3]-shaped numpy array of integer face indices into v (or None)
#   vn : [V, 3]-shaped numpy array of per-vertex normals (or None)
#   vt : [V, 2]-shaped numpy array of per-vertex uv coordinates (or None)
#   vc : [V, 4]-shaped numpy array of per-vertex RGBA colors in [0.0, 1.0] (or None)
#   vq : [V,]-shaped numpy array of per-vertex quality measures (or None)
#   vr : [V,]-shaped numpy array of per-vertex curvature radii (or None)
#   vti : [V,]-shaped numpy array of integer indices into textures indicating which
#         texture to use at this vertex (or None)
#   vflags : [V,]-shaped numpy array of 32-bit integer flags per vertex (or None)
#   fn : [F, 3]-shaped numpy array of per-face normals (or None)
#   fc : [F, 4]-shaped numpy array of per-face RGBA colors in [0.0, 1.0] (or None)
#   fq : [F,]-shaped numpy array of per-face quality measures (or None)
#   fflags : [F,]-shaped numpy array of 32-bit integer flags per face (or None)
#   wc : [F, 3, 4]-shaped numpy array of per-wedge RGBA colors in [0.0, 1.0] (or None)
#   wn : [F, 3, 3]-shaped numpy array of per-wedge normals (or None)
#   wt : [F, 3, 2]-shaped numpy array of per-wedge uv coordinates (or None)
#   wti : [F, 3]-shaped numpy array of integer indices into textures indicating which
#         texture to use at this wedge (or None)
#   textures : A list of paths to texture image files for this mesh
#   normal_maps : A list of paths to normal map image files for this mesh
pcu.save_triangle_mesh("path/to/mesh", v=v, f=f, vn=vertex_normals, vc=vertex_colors, fn=face_normals)

# You can also directly save a pcu.TriangleMesh object
mesh.save("path/to/mesh")
```

Saving meshes and point clouds

Point-Cloud-Utils supports writing many common mesh formats (PLY, STL, OFF, OBJ, 3DS, VRML 2.0, X3D, COLLADA). If it can be imported into MeshLab, we can write it! The type of file is inferred from its file extension.

If you only need to write a few attributes of a point cloud or mesh, the quickest way is to use one of the `save_mesh_*` utility functions:

```python
import point_cloud_utils as pcu

# Assume v, f, n, c are numpy arrays where
#   v are the mesh vertices of shape [V, 3]
#   f are the mesh face indices into v of shape [F, 3]
#   n are the mesh per-vertex normals of shape [V, 3]
#   c are the mesh per-vertex colors of shape [V, 4]
v, f, n, c = pcu.load_mesh_vfnc("input_mesh.ply")

# Save mesh vertices and faces
pcu.save_mesh_vf("path/to/mesh", v, f)

# Save mesh vertices and per-vertex normals
pcu.save_mesh_vn("path/to/mesh", v, n)

# Save mesh vertices, per-vertex normals, and per-vertex colors
pcu.save_mesh_vnc("path/to/mesh", v, n, c)

# Save mesh vertices, faces, and per-vertex normals
pcu.save_mesh_vfn("path/to/mesh", v, f, n)

# Save mesh vertices, faces, per-vertex normals, and per-vertex colors
pcu.save_mesh_vfnc("path/to/mesh", v, f, n, c)
```

Generating blue-noise samples on a mesh with Poisson-disk sampling

Generate 10000 samples on a mesh with Poisson-disk sampling:

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
# n is a nv by 3 NumPy array of vertex normals
v, f, n = pcu.load_mesh_vfn("my_model.ply")

# Generate 10000 samples on a mesh with Poisson-disk sampling.
# f_i are the face indices of each sample and bc are barycentric coordinates
# of the sample within a face
f_i, bc = pcu.sample_mesh_poisson_disk(v, f, n, 10000)

# Use the face indices and barycentric coordinates to compute sample positions and normals
v_poisson = pcu.interpolate_barycentric_coords(f, f_i, bc, v)
n_poisson = pcu.interpolate_barycentric_coords(f, f_i, bc, n)
```

Generate blue-noise samples on a mesh separated by approximately 0.01 times the bounding box diagonal:

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
# n is a nv by 3 NumPy array of vertex normals
v, f, n = pcu.load_mesh_vfn("my_model.ply")

# Generate samples on a mesh with Poisson-disk samples separated by approximately
# 0.01 times the length of the bounding box diagonal
bbox = np.max(v, axis=0) - np.min(v, axis=0)
bbox_diag = np.linalg.norm(bbox)

# f_i are the face indices of each sample and bc are barycentric coordinates
# of the sample within a face
f_i, bc = pcu.sample_mesh_poisson_disk(v, f, n, num_samples=-1, radius=0.01 * bbox_diag)

# Use the face indices and barycentric coordinates to compute sample positions and normals
v_sampled = pcu.interpolate_barycentric_coords(f, f_i, bc, v)
n_sampled = pcu.interpolate_barycentric_coords(f, f_i, bc, n)
```

Generate random samples on a mesh

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
# n is a nv by 3 NumPy array of vertex normals
v, f, n = pcu.load_mesh_vfn("my_model.ply")

# Generate random samples on the mesh (v, f, n).
# f_i are the face indices of each sample and bc are barycentric coordinates
# of the sample within a face
f_i, bc = pcu.sample_mesh_random(v, f, num_samples=v.shape[0] * 40)

# Use the face indices and barycentric coordinates to compute sample positions and normals
v_sampled = pcu.interpolate_barycentric_coords(f, f_i, bc, v)
n_sampled = pcu.interpolate_barycentric_coords(f, f_i, bc, n)
```

Downsample a point cloud to have a blue noise distribution

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# n is a nv by 3 NumPy array of vertex normals
v, n = pcu.load_mesh_vn("my_model.ply")

# Downsample a point cloud so that all the points are separated by approximately
# a fixed value, i.e. the downsampled points follow a blue noise distribution.
# idx is an array of integer indices into v indicating which samples to keep
radius = 0.01
idx = pcu.downsample_point_cloud_poisson_disk(v, radius)

# Use the indices to get the sample positions and normals
v_sampled = v[idx]
n_sampled = n[idx]
```

Downsample a point cloud on a voxel grid

Simple downsampling within the bounding box of a point cloud:

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# n is a nv by 3 NumPy array of vertex normals
# c is a nv by 4 NumPy array of vertex colors
v, n, c = pcu.load_mesh_vnc("my_model.ply")

# We'll use a voxel grid with 128 voxels per axis
num_voxels_per_axis = 128

# Size of the axis-aligned bounding box of the point cloud
bbox_size = v.max(0) - v.min(0)

# The size per axis of a single voxel
size_of_voxel = bbox_size / num_voxels_per_axis

# Downsample a point cloud on a voxel grid so there is at most one point per voxel.
# Any arguments after the points are treated as attribute arrays and get averaged within each voxel
v_sampled, n_sampled, c_sampled = pcu.downsample_point_cloud_on_voxel_grid(size_of_voxel, v, n, c)
```

Specifying the location of the voxel grid in space (e.g. to only consider points within a sub-region of the point cloud):

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# n is a nv by 3 NumPy array of vertex normals
# c is a nv by 4 NumPy array of vertex colors
v, n, c = pcu.load_mesh_vnc("my_model.ply")

# We'll use a voxel grid with 128 voxels per axis
num_voxels_per_axis = 128

# Size of the axis-aligned bounding box of the point cloud
bbox_size = v.max(0) - v.min(0)

# Let's say we only want to consider points in the top right corner of the bounding box
domain_min = v.min(0) + bbox_size / 2.0
domain_max = v.min(0) + bbox_size

# The size per axis of a single voxel
size_of_voxel = bbox_size / num_voxels_per_axis

# Downsample a point cloud on a voxel grid so there is at most one point per voxel.
# Multiple points, normals, and colors within a voxel cell are averaged together.
# min_bound and max_bound specify a bounding box in which we will downsample points
v_sampled, n_sampled, c_sampled = pcu.downsample_point_cloud_on_voxel_grid(
    size_of_voxel, v, n, c, min_bound=domain_min, max_bound=domain_max)
```

Discarding voxels with too few points:

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# n is a nv by 3 NumPy array of vertex normals
# c is a nv by 4 NumPy array of vertex colors
v, n, c = pcu.load_mesh_vnc("my_model.ply")

# We'll use a voxel grid with 128 voxels per axis
num_voxels_per_axis = 128

# Size of the axis-aligned bounding box of the point cloud
bbox_size = v.max(0) - v.min(0)

# The size per axis of a single voxel
size_of_voxel = bbox_size / num_voxels_per_axis

# We will throw away points within voxel cells containing fewer than 3 points
min_points_per_voxel = 3

# Downsample a point cloud on a voxel grid so there is at most one point per voxel.
# Multiple points, normals, and colors within a voxel cell are averaged together.
v_sampled, n_sampled, c_sampled = pcu.downsample_point_cloud_on_voxel_grid(
    size_of_voxel, v, n, c, min_points_per_voxel=min_points_per_voxel)
```

Compute closest points on a mesh

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Generate 1000 random query points. We will find the closest point on the mesh for each of these
p = np.random.rand(1000, 3)

# For each query point, find the closest point on the mesh. Here:
#   d is an array of closest distances for each query point with shape (1000,)
#   fi is an array of closest face indices for each point with shape (1000,)
#   bc is an array of barycentric coordinates within each face (shape (1000, 3))
#   of the closest point for each query point
d, fi, bc = pcu.closest_points_on_mesh(p, v, f)

# Convert barycentric coordinates to 3D positions
closest_points = pcu.interpolate_barycentric_coords(f, fi, bc, v)
```

Estimating normals from a point cloud

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
v = pcu.load_mesh_v("my_model.ply")

# Estimate a normal at each point (row of v) using its 16 nearest neighbors
n = pcu.estimate_point_cloud_normals_knn(v, 16)

# Estimate a normal at each point (row of v) using its neighbors within a 0.1-radius ball
n = pcu.estimate_point_cloud_normals_ball(v, 0.1)
```

Computing mesh normals per vertex

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Estimate per-vertex normals using the average of adjacent face normals.
# n is a NumPy array of shape [nv, 3] where n[i] is the normal of vertex v[i]
n = pcu.estimate_mesh_vertex_normals(v, f)
```

Computing mesh normals per face

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Compute a normal for each face.
# n is a NumPy array of shape [nf, 3] where n[i] is the normal of face f[i]
n = pcu.estimate_mesh_face_normals(v, f)
```

Consistently orienting faces of a mesh

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Re-orient faces in a mesh so they are consistent within each connected component.
# f_oriented is a (nf, 3)-shaped array of re-oriented face indices into v.
# f_comp_ids is a (nf,)-shaped array of component ids for each face,
# i.e. f_comp_ids[i] is the connected component id of face f[i]
f_oriented, f_comp_ids = pcu.orient_mesh_faces(f)
```

Approximate Wasserstein (Sinkhorn) distance between two point clouds

```python
import point_cloud_utils as pcu
import numpy as np

# a and b are arrays where each row contains a point.
# Note that the point sets can have different sizes (e.g. [100, 3], [111, 3])
a = np.random.rand(100, 3)
b = np.random.rand(100, 3)

# M is a 100x100 array where each entry (i, j) is the L2 distance between point a[i, :] and b[j, :]
M = pcu.pairwise_distances(a, b)

# w_a and w_b are masses assigned to each point. In this case each point is weighted equally.
w_a = np.ones(a.shape[0])
w_b = np.ones(b.shape[0])

# P is the transport matrix between a and b. eps is a regularization parameter: smaller
# epsilons lead to a better approximation of the true Wasserstein distance at the
# expense of slower convergence
P = pcu.sinkhorn(w_a, w_b, M, eps=1e-3)

# To get the distance as a number, just compute the Frobenius inner product
sinkhorn_dist = (M * P).sum()
```
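For intuition, Sinkhorn-style solvers alternately rescale the rows and columns of K = exp(-M / eps) until the transport plan has the requested marginals. A minimal NumPy sketch of that idea (the helper `sinkhorn_np` and its fixed iteration count are illustrative assumptions, not pcu's implementation):

```python
import numpy as np

def sinkhorn_np(wa, wb, M, eps=0.5, iters=500):
    # Entropic-regularized optimal transport: alternately rescale the rows and
    # columns of K = exp(-M / eps) so the plan P = diag(u) @ K @ diag(v)
    # has row sums wa and column sums wb. Smaller eps approximates the true
    # Wasserstein distance better but converges more slowly.
    K = np.exp(-M / eps)
    u = np.ones_like(wa)
    for _ in range(iters):
        v = wb / (K.T @ u)
        u = wa / (K @ v)
    return u[:, None] * K * v[None, :]

rng = np.random.default_rng(0)
a, b = rng.random((5, 2)), rng.random((5, 2))
M = np.linalg.norm(a[:, None] - b[None, :], axis=-1)
wa = np.full(5, 0.2)  # uniform masses summing to 1
wb = np.full(5, 0.2)
P = sinkhorn_np(wa, wb, M)
approx_dist = (M * P).sum()  # Frobenius inner product, as in the example above
```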

Chamfer distance between two point clouds

```python
import point_cloud_utils as pcu
import numpy as np

# a and b are arrays where each row contains a point.
# Note that the point sets can have different sizes (e.g. [100, 3], [111, 3])
a = np.random.rand(100, 3)
b = np.random.rand(100, 3)

chamfer_dist = pcu.chamfer_distance(a, b)
```
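Conceptually, the Chamfer distance combines the average nearest-neighbor distance in each direction between the two sets. A brute-force NumPy sketch of that definition (the helper `chamfer_np` is illustrative; pcu's implementation uses a fast nearest-neighbor index and its exact normalization may differ):

```python
import numpy as np

def chamfer_np(a, b):
    # Pairwise L2 distances between every point in a and every point in b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    # Average nearest-neighbor distance in each direction, summed
    return d.min(axis=1).mean() + d.min(axis=0).mean()

a = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = a.copy()
print(chamfer_np(a, b))  # identical point sets -> 0.0
```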

Hausdorff distance between two point clouds

```python
import point_cloud_utils as pcu
import numpy as np

# Generate two random point sets
a = np.random.rand(1000, 3)
b = np.random.rand(500, 3)

# Compute one-sided Hausdorff distances
hausdorff_a_to_b = pcu.one_sided_hausdorff_distance(a, b)
hausdorff_b_to_a = pcu.one_sided_hausdorff_distance(b, a)

# Take the max of the one-sided distances to get the two-sided Hausdorff distance
hausdorff_dist = pcu.hausdorff_distance(a, b)

# Find the index pairs of the two points with maximum shortest distance
hausdorff_b_to_a, idx_b, idx_a = pcu.one_sided_hausdorff_distance(b, a, return_index=True)
assert np.abs(np.sum((a[idx_a] - b[idx_b])**2) - hausdorff_b_to_a**2) < 1e-5, "These values should be almost equal"

# Find the index pairs of the two points with maximum shortest distance
hausdorff_dist, idx_b, idx_a = pcu.hausdorff_distance(b, a, return_index=True)
assert np.abs(np.sum((a[idx_a] - b[idx_b])**2) - hausdorff_dist**2) < 1e-5, "These values should be almost equal"
```
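For intuition, the one-sided Hausdorff distance from a to b is the worst-case nearest-neighbor distance. A brute-force NumPy sketch (the helper `one_sided_hausdorff_np` is illustrative, not pcu's API):

```python
import numpy as np

def one_sided_hausdorff_np(a, b):
    # For each point in a, the distance to its nearest neighbor in b; take the worst case
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).max()

a = np.array([[0.0, 0.0], [3.0, 0.0]])
b = np.array([[0.0, 0.0]])
print(one_sided_hausdorff_np(a, b))  # 3.0 -- the point (3, 0) is 3 away from everything in b
print(one_sided_hausdorff_np(b, a))  # 0.0 -- b's only point coincides with a point in a
# The two-sided Hausdorff distance is the max of the two one-sided values
```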

K-nearest-neighbors between two point clouds

```python
import point_cloud_utils as pcu
import numpy as np

# Generate two random point sets
pts_a = np.random.rand(1000, 3)
pts_b = np.random.rand(500, 3)

k = 10

# dists_a_to_b is of shape (pts_a.shape[0], k) and contains the (sorted) distances
# to the k nearest points in pts_b.
# corrs_a_to_b is of shape (pts_a.shape[0], k) and contains the index into pts_b of the
# k closest points for each point in pts_a
dists_a_to_b, corrs_a_to_b = pcu.k_nearest_neighbors(pts_a, pts_b, k)
```
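For small inputs you can sanity-check k-nearest-neighbor results against a brute-force NumPy version (the helper `knn_np` is illustrative only; pcu uses a nanoflann k-d tree, which is far faster for large point clouds):

```python
import numpy as np

def knn_np(a, b, k):
    # Brute-force k-nearest neighbors: for each row of a, the k closest rows of b
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # (na, nb)
    idx = np.argsort(d, axis=1)[:, :k]                          # (na, k), sorted by distance
    return np.take_along_axis(d, idx, axis=1), idx

a = np.array([[0.0, 0.0], [2.0, 0.0]])
b = np.array([[0.0, 1.0], [2.0, 1.0], [10.0, 0.0]])
dists, idx = knn_np(a, b, k=2)
# idx[0] is [0, 1]: b[0] is distance 1 from a[0], b[1] is sqrt(5) away
```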

Generating point samples in the square and cube with Lloyd relaxation

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Generate 1000 points on the mesh with Lloyd's algorithm
samples = pcu.sample_mesh_lloyd(v, f, 1000)

# Generate 100 points on the unit square with Lloyd's algorithm
samples_2d = pcu.lloyd_2d(100)

# Generate 100 points on the unit cube with Lloyd's algorithm
samples_3d = pcu.lloyd_3d(100)
```

Compute shortest signed distances to a triangle mesh with fast winding numbers

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Generate 1000 points in the volume around the mesh. We'll compute the signed
# distance to the mesh at each of these points
pts = np.random.rand(1000, 3) * (v.max(0) - v.min(0)) + v.min(0)

# Compute the sdf, the index of the closest face in the mesh, and the barycentric
# coordinates of the closest point on the mesh, for each point in pts
sdfs, face_ids, barycentric_coords = pcu.signed_distance_to_mesh(pts, v, f)
```

Deduplicating Point Clouds and Meshes

Point Clouds:

```python
import point_cloud_utils as pcu

# p is a (n, 3)-shaped array of points (one per row)
# n is a (n, 3)-shaped array of normals at each point
p, n = pcu.load_mesh_vn("my_pcloud.ply")

# Treat any points closer than 1e-7 apart as the same point.
# idx_i is an array of indices such that p_dedup = p[idx_i]
# idx_j is an array of indices such that p = p_dedup[idx_j]
p_dedup, idx_i, idx_j = pcu.deduplicate_point_cloud(p, 1e-7)

# Use idx_i to deduplicate the normals
n_dedup = n[idx_i]
```

Meshes:

```python
import point_cloud_utils as pcu

# v is a (nv, 3)-shaped NumPy array of vertices
# f is an (nf, 3)-shaped NumPy array of face indices into v
# c is a (nv, 4)-shaped NumPy array of per-vertex colors
v, f, c = pcu.load_mesh_vfc("my_model.ply")

# Treat any points closer than 1e-7 apart as the same point.
# idx_i is an array of indices such that v_dedup = v[idx_i]
# idx_j is an array of indices such that v = v_dedup[idx_j]
v_dedup, f_dedup, idx_i, idx_j = pcu.deduplicate_mesh_vertices(v, f, 1e-7)

# Use idx_i to deduplicate the colors
c_dedup = c[idx_i]
```

Removing unreferenced mesh vertices

```python
import point_cloud_utils as pcu

# v is a (nv, 3)-shaped NumPy array of vertices
# f is an (nf, 3)-shaped NumPy array of face indices into v
# c is a (nv, 4)-shaped NumPy array of per-vertex colors
v, f, c = pcu.load_mesh_vfc("my_model.ply")

# Remove vertices not referenced by any face.
# idx_v is an array of indices mapping each vertex in the output mesh to its index in the input
# idx_f is an array of indices mapping each face in the output mesh to its index in the input
v_clean, f_clean, idx_v, idx_f = pcu.remove_unreferenced_mesh_vertices(v, f)

c_clean = c[idx_v]
```

Calculating face areas of a mesh

```python
import point_cloud_utils as pcu

# v is a (nv, 3)-shaped NumPy array of vertices
# f is an (nf, 3)-shaped NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Compute the area of each face. face_areas[i] is the area of face f[i]
face_areas = pcu.mesh_face_areas(v, f)

# Remove faces with small areas by keeping only those above a threshold
f_new = f[face_areas >= 1e-4]
```

Smoothing a Mesh

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

num_iters = 3             # Number of smoothing iterations
use_cotan_weights = True  # Whether to use a cotangent-weighted Laplacian

# v_smooth contains the vertices of the smoothed mesh (the new mesh has the same face indices f)
v_smooth = pcu.laplacian_smooth_mesh(v, f, num_iters, use_cotan_weights=use_cotan_weights)
```

Computing connected components

```python
import point_cloud_utils as pcu
import numpy as np

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# cv is the index of the connected component of each vertex
# nv is the number of vertices per component
# cf is the index of the connected component of each face
# nf is the number of faces per connected component
cv, nv, cf, nf = pcu.connected_components(v, f)

# Extract the mesh of the connected component with the most faces
comp_max = np.argmax(nf)
v_max, f_max, _, _ = pcu.remove_unreferenced_mesh_vertices(v, f[cf == comp_max])
```

Decimating a triangle mesh

```python
import point_cloud_utils as pcu

v, f = pcu.load_mesh_vf("my_mesh.ply")
target_num_faces = f.shape[0] // 10  # Downsample by a factor of 10

# v_decimate, f_decimate are the vertices/faces of the decimated mesh.
# v_correspondence, f_correspondence are the vertices and faces in the dense mesh
# which generated each downsampled vertex/face
v_decimate, f_decimate, v_correspondence, f_correspondence = pcu.decimate_triangle_mesh(v, f, target_num_faces)
pcu.save_mesh_vf("decimated.ply", v_decimate, f_decimate)
```

Making a Mesh Watertight

```python
import point_cloud_utils as pcu

# v is a nv by 3 NumPy array of vertices
# f is an nf by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Optional resolution parameter (default is 20000).
# See https://github.com/hjwdzh/Manifold for details
resolution = 20000
v_watertight, f_watertight = pcu.make_mesh_watertight(v, f, resolution=resolution)
```

Ray/Mesh Intersection

```python
import point_cloud_utils as pcu
import numpy as np

# v is a #v by 3 NumPy array of vertices
# f is a #f by 3 NumPy array of face indices into v
# c is a #v by 4 NumPy array of vertex colors
v, f, c = pcu.load_mesh_vfc("my_model.ply")

# Generate rays on an image grid
uv = np.stack([a.ravel() for a in np.mgrid[-1:1:128j, -1.:1.:128j]], axis=-1)
ray_d = np.concatenate([uv, np.ones([uv.shape[0], 1])], axis=-1)
ray_d = ray_d / np.linalg.norm(ray_d, axis=-1, keepdims=True)
ray_o = np.array([[2.5, 0, -55.0] for _ in range(ray_d.shape[0])])

# Intersect rays with geometry
intersector = pcu.RayMeshIntersector(v, f)

# fid is the index of each face intersected (-1 for ray miss)
# bc are the barycentric coordinates of each intersected ray
# t are the distances from the ray origin to the intersection for each ray (inf for ray miss)
fid, bc, t = intersector.intersect_rays(ray_o, ray_d)

# Get intersection positions and colors by interpolating on the faces
hit_mask = np.isfinite(t)
hit_pos = pcu.interpolate_barycentric_coords(f, fid[hit_mask], bc[hit_mask], v)
hit_clr = pcu.interpolate_barycentric_coords(f, fid[hit_mask], bc[hit_mask], c)
```

Ray/Surfel Intersection

```python
import point_cloud_utils as pcu
import numpy as np

# v is a #v by 3 NumPy array of vertices
# n is a #v by 3 NumPy array of vertex normals
v, n = pcu.load_mesh_vn("my_model.ply")

# Generate rays on an image grid
uv = np.stack([a.ravel() for a in np.mgrid[-1:1:128j, -1.:1.:128j]], axis=-1)
ray_d = np.concatenate([uv, np.ones([uv.shape[0], 1])], axis=-1)
ray_d = ray_d / np.linalg.norm(ray_d, axis=-1, keepdims=True)
ray_o = np.array([[2.5, 0, -55.0] for _ in range(ray_d.shape[0])])

# Intersect rays with surfels with fixed radius 0.55
intersector = pcu.RaySurfelIntersector(v, n, r=0.55)

# pid is the index of each point intersected by a ray
# t are the distances from the ray origin to the intersection for each ray (inf for ray miss)
pid, t = intersector.intersect_rays(ray_o, ray_d)

# Get points intersected by rays
hit_mask = pid >= 0
intersected_points = v[pid[hit_mask]]
```

Computing curvature on a mesh

```python
import point_cloud_utils as pcu

# v is a #v by 3 NumPy array of vertices
# f is a #f by 3 NumPy array of face indices into v
v, f = pcu.load_mesh_vf("my_model.ply")

# Compute principal min/max curvature magnitudes (k1, k2) and directions (d1, d2)
# using the one-ring of each vertex
k1, k2, d1, d2 = pcu.mesh_principal_curvatures(v, f)

# Compute principal min/max curvature magnitudes (k1, k2) and directions (d1, d2)
# using a radius. This method is much more robust but requires tuning the radius
k1, k2, d1, d2 = pcu.mesh_principal_curvatures(v, f, r=0.1)

# Compute mean (kh) and Gaussian (kg) curvatures using the one-ring of each vertex
kh, kg = pcu.mesh_mean_and_gaussian_curvatures(v, f)

# Compute mean (kh) and Gaussian (kg) curvatures using a radius.
# This method is much more robust but requires tuning the radius
kh, kg = pcu.mesh_mean_and_gaussian_curvatures(v, f, r=0.1)
```

Computing a consistent inside and outside for a triangle soup

```python
import point_cloud_utils as pcu
import numpy as np

v, f = pcu.load_mesh_vf("my_model.ply")

# We're going to evaluate the inside/outside sign of 1000 points
p = np.random.rand(1000, 3)

# w has shape (1000,) where w[i] is the sign (positive for outside, negative for inside) of p[i]
w = pcu.triangle_soup_fast_winding_number(v, f, p.astype(v.dtype))
```

Voxelizing a triangle mesh

You can get a list of voxels which intersect a mesh as follows:

```python
import point_cloud_utils as pcu

v, f = pcu.load_mesh_vf("mesh.ply")  # Load some mesh

voxel_size = 1.0 / 128       # Size of each voxel
voxel_origin = [0., 0., 0.]  # Coordinate mapping to the bottom-left-back corner of the (0, 0, 0) voxel

# ijk is a [num_vox, 3] array of integer coordinates for each voxel intersecting the mesh
ijk = pcu.voxelize_triangle_mesh(v, f, voxel_size, voxel_origin)
```

Flood filling a dense grid

If you have a 3D grid, you can flood fill it starting from a coordinate as follows:

```python
import point_cloud_utils as pcu
import numpy as np

# Grid of 0/1 values (but we also support floats/doubles/etc.)
grid = (np.random.rand(128, 128, 128) > 0.5).astype(np.int32)

fill_value = 2  # Fill starting from [0, 0, 0] with the value 2
pcu.flood_fill_3d(grid, [0, 0, 0], fill_value)
```

Generating a mesh for a voxel grid

Suppose you an array ijk of integer voxel coordinates. You may wish to plot the associated voxel grid. You can do this via the voxel_grid_geometry function as follows

```python
import point_cloud_utils as pcu
import numpy as np

voxel_size = 1.0 / 200.0  # Size of each voxel
voxel_origin = [0.0, 0.0, 0.0]  # The position of the bottom-back-left corner of the (0, 0, 0) voxel

gap_fraction = 0.01  # Optionally generate a small gap between voxels which can look nice -- this is a fraction of the voxel size

ijk = np.random.randint(-100, 100, size=(128, 3))  # Generate 128 random voxels in [-100, 100]^3

# vox_v, vox_f are the vertices/faces of a mesh for the voxel grid
vox_v, vox_f = pcu.voxel_grid_geometry(ijk, voxel_size=voxel_size, voxel_origin=voxel_origin, gap_fraction=gap_fraction)
```
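To get a feel for the kind of geometry such a function returns, here is a hedged sketch that builds the 8 vertices and 12 triangles of a single voxel's cube by hand (the helper name `single_voxel_mesh` is made up for illustration; the library handles many voxels at once, plus the optional gaps):

```python
import numpy as np

def single_voxel_mesh(ijk, voxel_size, voxel_origin):
    """Vertices and triangle faces of the cube occupied by voxel (i, j, k)."""
    corner = np.asarray(voxel_origin) + np.asarray(ijk) * voxel_size
    # 8 cube corners as all 0/1 combinations, scaled and offset;
    # corner index is x*4 + y*2 + z
    offsets = np.array([[x, y, z] for x in (0, 1)
                                  for y in (0, 1)
                                  for z in (0, 1)], dtype=float)
    verts = corner + offsets * voxel_size
    # Two triangles per cube face, indexing the corner ordering above
    faces = np.array([
        [0, 1, 3], [0, 3, 2],  # x = 0 face
        [4, 6, 7], [4, 7, 5],  # x = 1 face
        [0, 4, 5], [0, 5, 1],  # y = 0 face
        [2, 3, 7], [2, 7, 6],  # y = 1 face
        [0, 2, 6], [0, 6, 4],  # z = 0 face
        [1, 5, 7], [1, 7, 3],  # z = 1 face
    ])
    return verts, faces

verts, faces = single_voxel_mesh([2, 0, 0], 0.5, [0.0, 0.0, 0.0])
```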

Owner

  • Name: Francis Williams
  • Login: fwilliams
  • Kind: user
  • Location: New York, NY
  • Company: NVIDIA

I'm a research scientist at NVIDIA working on 3D deep learning.

GitHub Events

Total
  • Create event: 4
  • Release event: 2
  • Issues event: 9
  • Watch event: 150
  • Member event: 1
  • Issue comment event: 17
  • Push event: 24
  • Pull request review event: 2
  • Pull request event: 9
  • Fork event: 13
Last Year
  • Create event: 4
  • Release event: 2
  • Issues event: 9
  • Watch event: 150
  • Member event: 1
  • Issue comment event: 17
  • Push event: 24
  • Pull request review event: 2
  • Pull request event: 9
  • Fork event: 13

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 530
  • Total Committers: 14
  • Avg Commits per committer: 37.857
  • Development Distribution Score (DDS): 0.064
Past Year
  • Commits: 46
  • Committers: 6
  • Avg Commits per committer: 7.667
  • Development Distribution Score (DDS): 0.239
Top Committers
| Name | Email | Commits |
| --- | --- | --- |
| Francis Williams | f****s@f****o | 496 |
| Layer3 | 5****3 | 8 |
| Matthew Cong | m****g@n****m | 6 |
| maurock | m****2@g****m | 5 |
| Anthony | a****i@g****m | 3 |
| Chris Barnes | c****s@m****k | 2 |
| Misha S | 4****e | 2 |
| Riccardo de Lutio | r****o@n****m | 2 |
| AppledoreM | a****g@g****m | 1 |
| David Caron | d****5@g****m | 1 |
| Ikko Eltociear Ashimine | e****r@g****m | 1 |
| Ussama Naal | u****l@o****o | 1 |
| augusto-de-freitas-isi | 1****i | 1 |
| dependabot[bot] | 4****] | 1 |
Committer Domains (Top 20 + Academic)

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 84
  • Total pull requests: 28
  • Average time to close issues: 3 months
  • Average time to close pull requests: about 1 month
  • Total issue authors: 63
  • Total pull request authors: 14
  • Average comments per issue: 3.38
  • Average comments per pull request: 1.21
  • Merged pull requests: 24
  • Bot issues: 0
  • Bot pull requests: 1
Past Year
  • Issues: 9
  • Pull requests: 12
  • Average time to close issues: 3 months
  • Average time to close pull requests: 8 days
  • Issue authors: 9
  • Pull request authors: 5
  • Average comments per issue: 0.89
  • Average comments per pull request: 0.17
  • Merged pull requests: 11
  • Bot issues: 0
  • Bot pull requests: 1
Top Authors
Issue Authors
  • fwilliams (11)
  • gattia (4)
  • maurock (3)
  • nimajam41 (2)
  • ghost (2)
  • Seikegn (2)
  • gursi26 (2)
  • cravisjan97 (2)
  • ShairozS (2)
  • bearinsuke (1)
  • nicolocarissimi (1)
  • utacc (1)
  • EAST-J (1)
  • rahul28suresh (1)
  • etaoxing (1)
Pull Request Authors
  • fwilliams (9)
  • matthewdcong (6)
  • Samahu (2)
  • augusto-de-freitas-isi (2)
  • dependabot[bot] (2)
  • riccardodelutio (2)
  • clbarnes (1)
  • Layer3 (1)
  • mishasweetpie (1)
  • davidcaron (1)
  • eltociear (1)
  • AppledoreM (1)
  • maurock (1)
  • NarinderS (1)
  • gattia (1)
Top Labels
Issue Labels
Pull Request Labels
dependencies (2)

Packages

  • Total packages: 3
  • Total downloads:
    • pypi 26,225 last-month
  • Total docker downloads: 654
  • Total dependent packages: 5
    (may contain duplicates)
  • Total dependent repositories: 10
    (may contain duplicates)
  • Total versions: 53
  • Total maintainers: 1
pypi.org: point-cloud-utils

A Python library for common tasks on 3D point clouds and meshes

  • Versions: 20
  • Dependent Packages: 5
  • Dependent Repositories: 9
  • Downloads: 26,225 Last month
  • Docker Downloads: 654
Rankings
Stargazers count: 1.9%
Docker downloads count: 2.1%
Dependent packages count: 3.2%
Average: 3.5%
Downloads: 4.3%
Forks count: 4.6%
Dependent repos count: 4.8%
Maintainers (1)
Last synced: 6 months ago
pypi.org: pypcu

A Python Library of utilities for point clouds

  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 0
Rankings
Stargazers count: 2.4%
Dependent packages count: 4.8%
Forks count: 5.3%
Dependent repos count: 6.3%
Average: 15.0%
Downloads: 56.0%
Last synced: about 1 year ago
conda-forge.org: point_cloud_utils
  • Versions: 29
  • Dependent Packages: 0
  • Dependent Repositories: 1
Rankings
Stargazers count: 14.1%
Forks count: 20.5%
Dependent repos count: 24.4%
Average: 27.7%
Dependent packages count: 51.6%
Last synced: 6 months ago

Dependencies

setup.py pypi
  • numpy *
  • scipy *
.github/workflows/build-wheels-and-publish-to-pipy.yml actions
  • actions/checkout v3 composite
  • actions/checkout v2 composite
  • actions/download-artifact v2 composite
  • actions/upload-artifact v3 composite
  • actions/upload-artifact v2 composite
  • docker/setup-qemu-action v2 composite
  • pypa/cibuildwheel v2.11.4 composite
  • pypa/gh-action-pypi-publish v1.4.2 composite
pyproject.toml pypi