├── ExampleMasks
│   ├── fillet_sq_example_mask.png
│   ├── python_model_mask_v0.png
│   ├── python_model_mask_v1A.png
│   └── python_model_mask_v2.png
├── Figures
│   ├── version0_example.gif
│   ├── version0_example_2.gif
│   ├── version0_example_3.gif
│   ├── version1_example.png
│   └── version2_example.gif
├── README.md
├── etch_model_version0.py
├── etch_model_version1.py
├── etch_model_version3_cellular_automata.py
└── etch_sim_utilities.py
/ExampleMasks/fillet_sq_example_mask.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/ExampleMasks/fillet_sq_example_mask.png
--------------------------------------------------------------------------------
/ExampleMasks/python_model_mask_v0.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/ExampleMasks/python_model_mask_v0.png
--------------------------------------------------------------------------------
/ExampleMasks/python_model_mask_v1A.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/ExampleMasks/python_model_mask_v1A.png
--------------------------------------------------------------------------------
/ExampleMasks/python_model_mask_v2.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/ExampleMasks/python_model_mask_v2.png
--------------------------------------------------------------------------------
/Figures/version0_example.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/Figures/version0_example.gif
--------------------------------------------------------------------------------
/Figures/version0_example_2.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/Figures/version0_example_2.gif
--------------------------------------------------------------------------------
/Figures/version0_example_3.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/Figures/version0_example_3.gif
--------------------------------------------------------------------------------
/Figures/version1_example.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/Figures/version1_example.png
--------------------------------------------------------------------------------
/Figures/version2_example.gif:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/cococastano/PythonDryEtchModel/ea84d827dd8acf0e0402ab63deb4c568aae5f227/Figures/version2_example.gif
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # PythonDryEtchingTool
2 | Code was developed for SNF at Stanford University in conjunction with the E241 course. These scripts are meant to predict etch profiles from dry silicon etching based on known etch rates. Three versions are provided (version 2 is the most up-to-date):
3 |
4 | * Version 2: Cellular automata implementation of dry silicon etching. Cells in a 3D grid are initialized with a state value of 1. As etching occurs, the cell state is decremented at a user-defined rate that, for isotropic steps, is adjusted by the calculated angle between the surface normal and the central vertical axis of the etched feature. The code tracks three containers of cells: (1) a dictionary of exposed cells whose keys are cell-center tuples and whose values define the attributes of each cell (i.e., surface normal, neighbors, state); (2) a set of neighbor cells holding the cell-center tuple of every neighbor of an exposed cell; (3) a set of removed cells holding the cell-center tuples of cells that have been etched away (state < 0). Etch conditions and simulation parameters are prescribed in the USER INPUTS section (clearly marked) of the main script (etch_model_version3_cellular_automata.py). All necessary methods are available in etch_sim_utilities.py. In the main script, the user should edit recipe_steps; for steps that are strictly Bosch or isotropic etching, the value under 'iso' or 'bosch', respectively, should be None. The user should also define the etch rate functions vert_rate, horiz_rate, and bosch_vert_step (a minimal sketch of these inputs follows the example animation below).
5 |
6 | 
7 |
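A minimal sketch of these USER INPUTS, assuming placeholder rates (the recipe keys and function names mirror the main script; the numbers are illustrative, not calibrated):

```
# Hypothetical recipe: 7 cycles of a tapered (Bosch + iso) step, then a purely
# isotropic finishing step. Use None for the process a step does not include.
recipe_steps = {'step1': {'bosch': 13, 'iso': 100, 'cycles': 7},
                'step2': {'bosch': None, 'iso': 70, 'cycles': 1}}

def vert_rate(z):        # vertical isotropic etch rate (um/s) at depth z (um)
    return 0.12 + z * (0.03 / 500)

def horiz_rate(z):       # horizontal isotropic etch rate (um/s)
    return 0.55 * vert_rate(z)

def bosch_vert_step(z):  # depth removed per Bosch cycle (um)
    return 0.74 + z * (0.1 / 500)
```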
8 | * Version 1 (DOES NOT WORK WELL): A more elegant implementation that evolves a surface object through easily defined recipe steps. This version is tailored for prescribing custom Bosch, isotropic, and tapered (combined Bosch and isotropic) etching steps. The surface is evolved by computing the surface normals and stepping points back along those normals by user-defined vertical and horizontal etch rates (see the sketch after the screenshot below). Because this version uses VTK-based rendering tools, it outputs nice interactive renders that can be saved as VTK files for later use. Below is an example screenshot from a 90 um Bosch etch (code under development).
9 |
10 | 
11 |
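A hedged 2D sketch of this normal-stepping idea in plain NumPy, mirroring the horiz_etch helper in the scripts (the function name and signature here are illustrative): estimate the tangent at each contour point from its neighbors, rotate it 90 degrees to get the normal, and march the point along that normal by rate*dt.

```
import numpy as np

def step_contour(points, rate, dt, span=3):
    """Advance a closed 2D contour (N x 2 array) along its point normals."""
    points = np.asarray(points, dtype=float)
    rot90 = np.array([[0.0, -1.0], [1.0, 0.0]])   # rotates a tangent by 90 degrees
    out = np.empty_like(points)
    n = len(points)
    for i, p in enumerate(points):
        tangent = points[(i + span) % n] - points[(i - span) % n]
        normal = rot90 @ tangent
        normal = normal / np.linalg.norm(normal)  # unit normal at this point
        out[i] = p + rate * dt * normal           # step the point along the normal
    return out
```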
12 | * Version 0: Simple implementation that utilizes polygon and path objects. This version grows the etch down and out layer by layer but does not evolve layers constructed at previous time points; i.e., a Bosch step at the beginning will grow down, but once that layer is formed, subsequent etching steps will not change its shape. This is useful for seeing the result of horizontally moving etch fronts and how they interact with each other (see the sketch after the animation below).
13 |
14 | 
15 |
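A minimal shapely sketch of how interacting etch fronts merge, assuming two hypothetical circular openings (the scripts use the older cascaded_union; unary_union is the modern equivalent):

```
from shapely.geometry import Point
from shapely.ops import unary_union

# Two circular etch fronts, 40 um in radius and 70 um apart, so they overlap.
a = Point(0, 0).buffer(40)
b = Point(70, 0).buffer(40)
front = unary_union([a, b])   # the overlapping fronts merge into one polygon
print(front.geom_type)        # 'Polygon'
```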
16 | ## Environment
17 | Python 3.6 was used for developing these scripts. My personal preference is the Spyder IDE in the Anaconda environment. The solvers rely heavily on imported Python packages, so your environment should have the following installed: openCV (cv2), shapely, and pyvista. I encourage you to familiarize yourself with the documentation and install instructions for each package; in particular, pyvista has a number of dependencies, most notably vtk, and for full functionality also imageio, appdirs, and meshio.
18 |
19 | All these packages can be directly installed from the command line (i.e., Anaconda command line) with pip. For example:
20 | ```
21 | pip install vtk
22 | pip install opencv-python
23 | ```
24 |
25 | * numpy >= 1.19.2
26 | * pyvista >= 0.27.4
27 | * matplotlib >= 3.3.4
28 | * cv2 >= 4.4.0
29 | * scipy >= 1.5.2
30 | * shapely >= 1.6.4
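For reference, everything above (plus the pyvista extras mentioned earlier) can be installed in one command; exact version pins are optional:

```
pip install numpy pyvista matplotlib opencv-python scipy shapely imageio appdirs meshio
```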
31 |
32 |
33 | ## Process Flow
34 | Most input expected from the user is specified in the top sections of the code. Most notably, the path to an etch mask is required. This is a binary .png file (example shown below, followed by a minimal loading sketch).
35 |
36 | 
37 |
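A hedged sketch of how the mask is read and its contours extracted (the file name here is illustrative, the mask is assumed to be 8-bit, and cv2.findContours returns two values in OpenCV 4.x, whereas the scripts unpack the three returned by OpenCV 3.x):

```
import cv2

curr_im = cv2.imread('python_model_mask_v2.png', cv2.IMREAD_ANYDEPTH)
curr_im = cv2.GaussianBlur(curr_im, (3, 3), 0)
conts, hier = cv2.findContours(curr_im, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)

# draw the contours on an RGB copy to verify the mask was read correctly
rgb_im = cv2.cvtColor(curr_im, cv2.COLOR_GRAY2RGB)
cv2.drawContours(rgb_im, conts, -1, (0, 255, 0), 3)
cv2.imwrite('mask_contours_check.png', rgb_im)
```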
38 | Here I will describe the Version 1 code, as it is meant to replace Version 0. After the mask is provided, tune the desired etch rates in the vertical and horizontal directions; the curved profile of the isotropic etch will be interpolated from these. Other parameters that affect solution time can also be set, such as the time step (t_step) or mesh resolution (set_res).
39 |
40 | ## Future of the Code
41 | I am not a coder, so my code is messy. I would be happy for a savvy coder to come along, clean up this work, and maybe create more elegant classes. For SNF users, I eventually want there to be enough data that this tool can take in recipe settings (e.g., gas composition and bias voltage) and interpolate horizontal and vertical etch rates.
42 |
43 | ## Contact:
44 | ncastano@stanford.edu
45 |
--------------------------------------------------------------------------------
/etch_model_version0.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib as mpl
3 | import matplotlib.pyplot as plt
4 | import matplotlib.patches as patches
5 | import cv2
6 | from scipy.spatial import distance as dist
7 | from matplotlib.path import Path
8 | from shapely.geometry.polygon import Polygon
9 | from shapely.geometry import Point
10 | from shapely.ops import cascaded_union
11 | from mpl_toolkits.mplot3d import Axes3D
12 | from scipy.interpolate import splprep, splev
13 | mpl.style.use('default')
14 |
15 |
16 |
17 |
18 | """
19 | Written by Nicolas Castano
20 |
21 | Model continuous etching into a silicon wafer on the PT-DSE tool in the SNF
22 | based on known etch rates.
23 | """
24 |
25 | def horiz_etch(cont,horiz_rate,t_step,norm_span,sm_window):
26 | # method to apply normal step to a contour
27 | out_cont = np.zeros_like(cont)
28 | for p, point in enumerate(cont):
29 | # x_q = point[0] # if you want to plot a quiver for normal lines
30 | # y_q = point[1]
31 | # calculate normal to point
32 | # enable looping index to beginning
33 | if p + norm_span > len(cont)-1:
34 | dummy_p = 0
35 | else:
36 | dummy_p = p
37 | p_1 = cont[dummy_p-norm_span]
38 | p_2 = cont[dummy_p+norm_span]
39 | tan_vec = np.array([[p_2[0]-p_1[0]],
40 | [p_2[1]-p_1[1]]])
41 | norm_vec = np.matmul(rot90_mat,tan_vec)
42 | unit_norm = norm_vec/np.linalg.norm(norm_vec)
43 |
44 | # if t==30:
45 | # ax.quiver(x_q,y_q,unit_norm[0],unit_norm[1],width=0.002)
46 | # ax.plot(cont[:,0],cont[:,1],'o')
47 |
48 | # calculate new point
49 | new_pt = point + horiz_rate*t_step*np.reshape(unit_norm,(1,2))
50 | out_cont[p,:] = new_pt
51 |
52 | # force last point to be on top of the first point in the contour
53 | out_cont[-1,0] = out_cont[0,0]
54 | out_cont[-1,1] = out_cont[0,1]
55 | # smooth with spline
56 | tck, u = splprep(out_cont.T, u=None, s=0, per=1)
57 | u_new = np.linspace(u.min(), u.max(), len(cont))
58 | x_spline, y_spline = splev(u_new, tck, der=0)
59 | out_cont = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
60 | np.reshape(np.array(y_spline),[len(y_spline),1])))
61 |
62 | return out_cont
63 |
64 | #def animate(index):
65 | # zi = ml.griddata(x, y, zlist[index], xi, yi, interp='linear')
66 | # ax.clear()
67 | # ax.contourf(xi, yi, zi, **kw)
68 | # ax.set_title('%03d'%(index))
69 |
70 |
71 |
72 |
73 | C4F8 = 100 # sccm
74 | SF6 = 300 # sccm
75 | bias = 10 # volts
76 | time = 600 # seconds
77 | opening = 100 # um
78 |
79 | plt.close('all')
80 |
81 | # load mask
82 | im_dir = 'C:/Users/nicas/Documents/E241-MicroNanoFab/masks/'
83 | im_file = 'python_model_fil_sq.png'
84 | im_path = im_dir + im_file
85 | curr_im = cv2.imread(im_path, cv2.IMREAD_ANYDEPTH)
86 | curr_im = cv2.GaussianBlur(curr_im,(3,3),0)
87 |
88 |
89 | rgb_im = cv2.cvtColor(curr_im, cv2.COLOR_GRAY2RGB)
90 |
91 | cont_im, conts, hier = cv2.findContours(curr_im, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
92 | conts_im = cv2.drawContours(rgb_im, conts, -1, (0,255,0),3)
93 |
94 |
95 | dummy_i = im_file.find('.png')
96 | out_file = im_dir + im_file[:dummy_i] + '_out' + im_file[dummy_i:]
97 | cv2.imwrite(out_file, conts_im)
98 |
99 |
100 | t_start = 0
101 | t_end = 600 # seconds
102 | t_step = 5
103 | h = curr_im.shape[0]
104 | w = curr_im.shape[1]
105 | n_points = 600
106 | contour_read_step = 5
107 | topo_im = np.zeros_like(curr_im)
108 | norm_span = 3
109 | window_len = 17
110 | rot90_mat = np.array([[np.cos(np.pi/2), -np.sin(np.pi/2)],
111 | [np.sin(np.pi/2), np.cos(np.pi/2)]])
112 | vert_rate = 287/600 # um/s
113 |
114 | horiz_rate = 77/600 # um/s
115 | pixel_um_conv = 251/90.4672 # px/um
116 | cmap = 'gnuplot' # 'inferno' 'viridis' # 'hot'
117 | vmin = -290 # expected range of depth for color bar (min)
118 | vmax = 0
119 | # for plotting the surface plot
120 | rstride = 2
121 | cstride = 2
122 |
123 |
124 | x_axis = np.linspace(0,w/pixel_um_conv,n_points)
125 | y_axis = np.linspace(0,h/pixel_um_conv,n_points)
126 | xv,yv = np.meshgrid(x_axis,y_axis)
127 | x_points = np.ravel(xv)
128 | x_points = x_points.reshape((len(x_points),1))
129 | y_points = np.ravel(yv)
130 | y_points = y_points.reshape((len(y_points),1))
131 | grid_point_pairs = np.hstack((x_points,y_points))
132 |
133 |
134 |
135 | # get points for each contour
136 | # tracking paths and polygons
137 | conts_paths = {}
138 | conts_polys = {}
139 | topo_data = {}
140 | unit_norm_vectors = {}
141 |
142 |
143 | #fig, ax = plt.subplots()
144 | for c, cont in enumerate(conts):
145 | x = []
146 | y = []
147 | for p, point in enumerate(cont):
148 | if p%contour_read_step == 0:
149 | x.append(point[0][0]/pixel_um_conv)
150 | y.append(point[0][1]/pixel_um_conv)
151 | # plt.scatter(point[0][0],point[0][1],3,'k')
152 | # plt.text(point[0][0],point[0][1],str(p))
153 |
154 | # force last point to be on top of first point
155 | x[-1] = x[0]
156 | y[-1] = y[0]
157 | # smooth contour with spline
158 | points = np.hstack((np.reshape(np.array(x),[len(x),1]),
159 | np.reshape(np.array(y),[len(y),1])))
160 |
161 | tck, u = splprep(points.T, u=None, s=0.0, per=1)
162 | u_new = np.linspace(u.min(), u.max(), len(cont))
163 | x_spline, y_spline = splev(u_new, tck, der=0)
164 |
165 | points = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
166 | np.reshape(np.array(y_spline),[len(y_spline),1])))
167 |
168 |
169 | temp_poly = Polygon(points)
170 | temp_path = Path(temp_poly.exterior,closed=True)
171 | conts_paths[c] = temp_path # path object nice for the contains_point attribute
172 | conts_polys[c] = temp_poly
173 |
174 | unit_norm_vectors[c] = np.zeros_like(conts_paths[c].vertices)
175 | topo_data[c] = np.zeros((grid_point_pairs.shape[0],1)) # each point will have a depth
176 |
177 |
178 | # patch = patches.PathPatch(temp_path, facecolor='orange', lw=2)
179 | # ax.add_patch(patch)
180 | #ax.autoscale_view()
181 | #plt.show()
182 |
183 | x = grid_point_pairs[:,0].reshape(xv.shape)
184 | y = grid_point_pairs[:,1].reshape(yv.shape)
185 |
186 | fig2, ax2 = plt.subplots(figsize=(8,7))
187 |
188 | dummy_cont_count = len(conts_paths)
189 |
190 | topo = []
191 |
192 | # solve the etching of the mask
193 | for i_t,t in enumerate(range(t_start, t_end, t_step)):
194 | # vert_rate = (10+2/600*t)/60 # (10 + 0.0000056969697*t**2)/60
195 | # horiz_rate = (3-(-0.1318335/0.04394449)*(1-np.exp(-0.04394449*t)))/60
196 | print('solving time: ', t)
197 | z_mask = np.zeros_like(xv) # 1 if etch back at node, 0 if not
198 | topo.append(np.zeros_like(xv))
199 | cont_loop = True
200 | overlap = False
201 | cummul_paths = {}
202 | c = 0
203 |
204 | # determine the overlapping points in adjacent mask openings
205 | # 0 for no mask opening, 1 for opening
206 | # solving for surface level contour
207 | while cont_loop == True:
208 | # for c in cont_arrays:
209 | print(' checking for OVERLAP with contour: ', c)
210 | # determine if contours overlap
211 | other_conts = list(range(dummy_cont_count))
212 | other_conts.pop(other_conts.index(c))
213 | for oc in other_conts:
214 | for pt in conts_paths[oc].vertices:
215 | if conts_paths[c].contains_point(pt):
216 | overlap = True
217 | break
218 | if overlap == True: break
219 |
220 |
221 | # # stack all contours and convert to binary mask
222 | #
223 | # inside = conts_paths[c].contains_points(grid_point_pairs)
224 | # z_mask += inside.astype(int).reshape(xv.shape)
225 | # z_mask[z_mask>0] = 1
226 |
227 |
228 |
229 |
230 |
231 |
232 |
233 |
234 | # if one overlaps assume all are overlapping
235 | if overlap == True:
236 | print(' overlap detected')
237 | #combine contours
238 | polys = [conts_polys[poly] for poly in list(conts_polys.keys())]
239 | new_cont = cascaded_union([poly if poly.is_valid
240 | else poly.buffer(0) for poly in polys])
241 |
242 | # smooth with spline
243 | try:
244 | x_temp,y_temp = new_cont.exterior.xy
245 | x_temp[-1] = x_temp[0]
246 | y_temp[-1] = y_temp[0]
247 | temp_cont = np.hstack((np.reshape(np.array(x_temp),[len(x_temp),1]),
248 | np.reshape(np.array(y_temp),[len(y_temp),1])))
249 | tck, u = splprep(temp_cont.T, u=None, s=13, per=1)
250 | u_new = np.linspace(u.min(), u.max(), len(cont))
251 | x_spline, y_spline = splev(u_new, tck, der=0)
252 | points = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
253 | np.reshape(np.array(y_spline),[len(y_spline),1]))) # use spline-smoothed points
254 | cummul_paths[c] = Path(points,closed=True)
255 |
256 | # false to exit while loop
257 | cont_loop = False
258 | except:
259 | overlap = False
260 |
261 |
262 |
263 | if overlap == False:
264 | # check if points are inside contour (removed from mask)
265 | cummul_paths[c] = conts_paths[c]
266 | c += 1
267 | if c == dummy_cont_count: cont_loop = False
268 |
269 |
270 | # adjust contours, ignoring the overlap to get true vertical etch
271 | # stack all contours and convert to binary mask
272 | for c in conts_paths:
273 | print(' solving VERT etch in contour: ', c)
274 | new_cont_points = horiz_etch(conts_paths[c].vertices,horiz_rate,
275 | t_step,norm_span,window_len)
276 |
277 | conts_paths[c] = Path(new_cont_points,closed=True)
278 | conts_polys[c] = Polygon(new_cont_points)
279 |
280 | inside = conts_paths[c].contains_points(grid_point_pairs)
281 | z_mask += inside.astype(int).reshape(xv.shape)
282 | z_mask[z_mask>0] = 1
283 | # update topography using the z_mask
284 | z_step = z_mask * (vert_rate*t_step)
285 | # etch back; reference the last time step, except when it's the first time step
286 | try:
287 | topo[i_t] = topo[i_t-1] - z_step
288 | except:
289 | topo[i_t] -= z_step
290 |
291 |
292 | # # dummy plot
293 | ax2.plot(t,vert_rate/horiz_rate,'k')
294 | # if i_t>400 and c==3:
295 | # patch = patches.PathPatch(conts_paths[c], fill=False, lw=2)
296 | # ax2.add_patch(patch)
297 | # contourplot = plt.contourf(x, y, z_mask, 100, cmap=cmap,vmin=0, vmax=2)
298 | # ax2.autoscale()
299 | # ax2, _ = mpl.colorbar.make_axes(plt.gca())
300 | # cbar = mpl.colorbar.ColorbarBase(ax2, cmap=cmap,
301 | # norm=mpl.colors.Normalize(vmin=0, vmax=2),
302 | # label=' etch depth [um]')
303 | # cbar.set_clim(0, 2)
304 |
305 |
306 |
307 | fig_2d_cont, ax1_2d_cont = plt.subplots(figsize=(13,12))
308 |
309 | # now solve the horizontal step in the combined contour
310 | for c in cummul_paths:
311 | print('    solving HORIZ etch in cumulative contour: ', c)
312 |
313 | curr_path = cummul_paths[c]
314 | try:
315 | updated_cont = horiz_etch(curr_path.vertices,horiz_rate,
316 | t_step,norm_span,window_len)
317 | except:
318 | pass
319 | dummy_cont_path = Path(updated_cont,closed=True)
320 | patch = patches.PathPatch(dummy_cont_path, fill=False, lw=2)
321 | ax1_2d_cont.add_patch(patch)
322 | ax1_2d_cont.plot(dummy_cont_path.vertices[:,0],
323 | dummy_cont_path.vertices[:,1],'k')
324 |
325 |
326 | # plot 2d contour
327 | contourplot = plt.contourf(x, y, topo[i_t], 500, cmap=cmap,vmin=vmin, vmax=vmax)
328 | title_str = 't = %s s' % str(t)
329 | plt.title(title_str)
330 | ax1_2d_cont, _ = mpl.colorbar.make_axes(plt.gca())
331 | cbar = mpl.colorbar.ColorbarBase(ax1_2d_cont, cmap=cmap,
332 | norm=mpl.colors.Normalize(vmin=vmin, vmax=vmax),
333 | label=' etch depth [um]')
334 | cbar.set_clim(vmin, vmax)
335 | ax1_2d_cont.autoscale()
336 | out_fig = 'C:/Users/nicas/Documents/E241-MicroNanoFab/codes/comb_contours/' + \
337 | str(t) + '.png'
338 | plt.savefig(out_fig, bbox_inches='tight')
339 | plt.close()
340 |
341 |
342 |
343 | # plot 3d surface
344 | fig_3d_surf = plt.figure(figsize=(24,10))
345 | # ax2_3d_surf = fig_3d_surf.gca(projection='3d')
346 | ax2_3d_surf = fig_3d_surf.add_subplot(111, projection='3d')
347 |
348 | # fig_3d_surf = plt.figure(figsize=(20,11))
349 | # ax2_3d_surf = fig_3d_surf.add_subplot(1,2,1,projection='3d')
350 | surf = ax2_3d_surf.plot_surface(x, y, topo[i_t-1], rstride=rstride,
351 | cstride=cstride,
352 | cmap=cmap,vmin=vmin, vmax=vmax,
353 | linewidth=0, antialiased=False)
354 | ax2_3d_surf.set_zlim(vmin, vmax)
355 | ax2_3d_surf.view_init(65, -60)
356 | title_str = 't = %s s' % str(t)
357 | plt.title(title_str)
358 | # add a color bar which maps values to colors.
359 | ax2_3d_surf, _ = mpl.colorbar.make_axes(plt.gca())
360 | cbar = mpl.colorbar.ColorbarBase(ax2_3d_surf, cmap=cmap,
361 | norm=mpl.colors.Normalize(vmin=vmin, vmax=vmax),
362 | label=' etch depth [um]')
363 | cbar.set_clim(vmin, vmax)
364 | ax2_3d_surf.autoscale()
365 | # fig_3d_surf.colorbar(surf, shrink=0.5, aspect=5)
366 | out_fig = 'C:/Users/nicas/Documents/E241-MicroNanoFab/codes/comb_contours_3d/' + \
367 | str(t) + '.png'
368 | plt.savefig(out_fig, bbox_inches='tight')
369 | plt.close()
370 |
371 |
372 |
373 | fig3, ax3 = plt.subplots(figsize=(8,7))
374 | for i,_ in enumerate(x):
375 | line_x = np.sqrt(x[i,i]**2 + y[i,i]**2)
376 | ax3.scatter(line_x,topo[i_t-1][i,i])
377 | # ax2.set_title(title_str)
378 |
379 | # ax2.add_patch(patch_dummy)
380 | ax3.autoscale()
381 | plt.show()
382 |
383 |
384 | fig3, ax3 = plt.subplots(figsize=(8,7))
385 | dummy_i = int(n_points/2)
386 | plt.plot(x[dummy_i,:],topo[i_t-1][dummy_i,:])
387 | # ax2.set_title(title_str)
388 |
389 | # ax2.add_patch(patch_dummy)
390 | plt.show()
391 |
392 |
--------------------------------------------------------------------------------
/etch_model_version1.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib as mpl
3 | import matplotlib.pyplot as plt
4 | import matplotlib.patches as patches
5 | import cv2
6 | from scipy.spatial import distance as dist
7 | from matplotlib.path import Path
8 | from shapely.geometry.polygon import Polygon
9 | from shapely.geometry import Point
10 | from shapely.ops import cascaded_union
11 | from mpl_toolkits.mplot3d import Axes3D
12 | from scipy.interpolate import splprep, splev
13 | mpl.style.use('default')
14 |
15 |
16 |
17 |
18 | """
19 | Written by Nicolas Castano
20 |
21 | Model continuous etching into a silicon wafer on the PT-DSE tool in the SNF
22 | based on known etch rates.
23 | """
24 |
25 | def horiz_etch(cont,horiz_rate,t_step,norm_span,sm_window):
26 | # method to apply normal step to a contour
27 | out_cont = np.zeros_like(cont)
28 | for p, point in enumerate(cont):
29 | # x_q = point[0] # if you want to plot a quiver for normal lines
30 | # y_q = point[1]
31 | # calculate normal to point
32 | # enable looping index to beginning
33 | if p + norm_span > len(cont)-1:
34 | dummy_p = 0
35 | else:
36 | dummy_p = p
37 | p_1 = cont[dummy_p-norm_span]
38 | p_2 = cont[dummy_p+norm_span]
39 | tan_vec = np.array([[p_2[0]-p_1[0]],
40 | [p_2[1]-p_1[1]]])
41 | norm_vec = np.matmul(rot90_mat,tan_vec)
42 | unit_norm = norm_vec/np.linalg.norm(norm_vec)
43 |
44 | # if t==30:
45 | # ax.quiver(x_q,y_q,unit_norm[0],unit_norm[1],width=0.002)
46 | # ax.plot(cont[:,0],cont[:,1],'o')
47 |
48 | # calculate new point
49 | new_pt = point + horiz_rate*t_step*np.reshape(unit_norm,(1,2))
50 | out_cont[p,:] = new_pt
51 |
52 | # force last point to be on top of the first point in the contour
53 | out_cont[-1,0] = out_cont[0,0]
54 | out_cont[-1,1] = out_cont[0,1]
55 | # smooth with spline
56 | tck, u = splprep(out_cont.T, u=None, s=0, per=1)
57 | u_new = np.linspace(u.min(), u.max(), len(cont))
58 | x_spline, y_spline = splev(u_new, tck, der=0)
59 | out_cont = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
60 | np.reshape(np.array(y_spline),[len(y_spline),1])))
61 |
62 | return out_cont
63 |
64 | #def animate(index):
65 | # zi = ml.griddata(x, y, zlist[index], xi, yi, interp='linear')
66 | # ax.clear()
67 | # ax.contourf(xi, yi, zi, **kw)
68 | # ax.set_title('%03d'%(index))
69 |
70 |
71 |
72 |
73 | C4F8 = 100 # sccm
74 | SF6 = 300 # sccm
75 | bias = 10 # volts
76 | time = 600 # seconds
77 | opening = 100 # um
78 |
79 | plt.close('all')
80 |
81 | # load mask
82 | im_dir = 'C:/Users/nicas/Documents/E241-MicroNanoFab/masks/'
83 | im_file = 'python_model_fil_sq.png'
84 | im_path = im_dir + im_file
85 | curr_im = cv2.imread(im_path, cv2.IMREAD_ANYDEPTH)
86 | curr_im = cv2.GaussianBlur(curr_im,(3,3),0)
87 |
88 |
89 | rgb_im = cv2.cvtColor(curr_im, cv2.COLOR_GRAY2RGB)
90 |
91 | cont_im, conts, hier = cv2.findContours(curr_im, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
92 | conts_im = cv2.drawContours(rgb_im, conts, -1, (0,255,0),3)
93 |
94 |
95 | dummy_i = im_file.find('.png')
96 | out_file = im_dir + im_file[:dummy_i] + '_out' + im_file[dummy_i:]
97 | cv2.imwrite(out_file, conts_im)
98 |
99 |
100 | t_start = 0
101 | t_end = 600 # seconds
102 | t_step = 5
103 | h = curr_im.shape[0]
104 | w = curr_im.shape[1]
105 | n_points = 600
106 | contour_read_step = 5
107 | topo_im = np.zeros_like(curr_im)
108 | norm_span = 3
109 | window_len = 17
110 | rot90_mat = np.array([[np.cos(np.pi/2), -np.sin(np.pi/2)],
111 | [np.sin(np.pi/2), np.cos(np.pi/2)]])
112 | vert_rate = 287/600 # um/s
113 |
114 | horiz_rate = 77/600 # um/s
115 | pixel_um_conv = 251/90.4672 # px/um
116 | cmap = 'gnuplot' # 'inferno' 'viridis' # 'hot'
117 | vmin = -290 # expected range of depth for color bar (min)
118 | vmax = 0
119 | # for plotting the surface plot
120 | rstride = 2
121 | cstride = 2
122 |
123 |
124 | x_axis = np.linspace(0,w/pixel_um_conv,n_points)
125 | y_axis = np.linspace(0,h/pixel_um_conv,n_points)
126 | xv,yv = np.meshgrid(x_axis,y_axis)
127 | x_points = np.ravel(xv)
128 | x_points = x_points.reshape((len(x_points),1))
129 | y_points = np.ravel(yv)
130 | y_points = y_points.reshape((len(y_points),1))
131 | grid_point_pairs = np.hstack((x_points,y_points))
132 |
133 |
134 |
135 | # get points for each contour
136 | # tracking paths and polygons
137 | conts_paths = {}
138 | conts_polys = {}
139 | topo_data = {}
140 | unit_norm_vectors = {}
141 |
142 |
143 | #fig, ax = plt.subplots()
144 | for c, cont in enumerate(conts):
145 | x = []
146 | y = []
147 | for p, point in enumerate(cont):
148 | if p%contour_read_step == 0:
149 | x.append(point[0][0]/pixel_um_conv)
150 | y.append(point[0][1]/pixel_um_conv)
151 | # plt.scatter(point[0][0],point[0][1],3,'k')
152 | # plt.text(point[0][0],point[0][1],str(p))
153 |
154 | # force last point to be on top of first point
155 | x[-1] = x[0]
156 | y[-1] = y[0]
157 | # smooth contour with spline
158 | points = np.hstack((np.reshape(np.array(x),[len(x),1]),
159 | np.reshape(np.array(y),[len(y),1])))
160 |
161 | tck, u = splprep(points.T, u=None, s=0.0, per=1)
162 | u_new = np.linspace(u.min(), u.max(), len(cont))
163 | x_spline, y_spline = splev(u_new, tck, der=0)
164 |
165 | points = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
166 | np.reshape(np.array(y_spline),[len(y_spline),1])))
167 |
168 |
169 | temp_poly = Polygon(points)
170 | temp_path = Path(temp_poly.exterior,closed=True)
171 | conts_paths[c] = temp_path # path object nice for the contains_point attribute
172 | conts_polys[c] = temp_poly
173 |
174 | unit_norm_vectors[c] = np.zeros_like(conts_paths[c].vertices)
175 | topo_data[c] = np.zeros((grid_point_pairs.shape[0],1)) # each point will have a depth
176 |
177 |
178 | # patch = patches.PathPatch(temp_path, facecolor='orange', lw=2)
179 | # ax.add_patch(patch)
180 | #ax.autoscale_view()
181 | #plt.show()
182 |
183 | x = grid_point_pairs[:,0].reshape(xv.shape)
184 | y = grid_point_pairs[:,1].reshape(yv.shape)
185 |
186 | fig2, ax2 = plt.subplots(figsize=(8,7))
187 |
188 | dummy_cont_count = len(conts_paths)
189 |
190 | topo = []
191 |
192 | # solve the etching of the mask
193 | for i_t,t in enumerate(range(t_start, t_end, t_step)):
194 | # vert_rate = (10+2/600*t)/60 # (10 + 0.0000056969697*t**2)/60
195 | # horiz_rate = (3-(-0.1318335/0.04394449)*(1-np.exp(-0.04394449*t)))/60
196 | print('solving time: ', t)
197 | z_mask = np.zeros_like(xv) # 1 if etch back at node, 0 if not
198 | topo.append(np.zeros_like(xv))
199 | cont_loop = True
200 | overlap = False
201 | cummul_paths = {}
202 | c = 0
203 |
204 | # determine the overlapping points in adjacent mask openings
205 | # 0 for no mask opening, 1 for opening
206 | # solving for surface level contour
207 | while cont_loop == True:
208 | # for c in cont_arrays:
209 | print(' checking for OVERLAP with contour: ', c)
210 | # determine if contours overlap
211 | other_conts = list(range(dummy_cont_count))
212 | other_conts.pop(other_conts.index(c))
213 | for oc in other_conts:
214 | for pt in conts_paths[oc].vertices:
215 | if conts_paths[c].contains_point(pt):
216 | overlap = True
217 | break
218 | if overlap == True: break
219 |
220 |
221 | # # stack all contours and convert to binary mask
222 | #
223 | # inside = conts_paths[c].contains_points(grid_point_pairs)
224 | # z_mask += inside.astype(int).reshape(xv.shape)
225 | # z_mask[z_mask>0] = 1
226 |
227 |
228 |
229 |
230 |
231 |
232 |
233 |
234 | # if one overlaps assume all are overlapping
235 | if overlap == True:
236 | print(' overlap detected')
237 | #combine contours
238 | polys = [conts_polys[poly] for poly in list(conts_polys.keys())]
239 | new_cont = cascaded_union([poly if poly.is_valid
240 | else poly.buffer(0) for poly in polys])
241 |
242 | # smooth with spline
243 | try:
244 | x_temp,y_temp = new_cont.exterior.xy
245 | x_temp[-1] = x_temp[0]
246 | y_temp[-1] = y_temp[0]
247 | temp_cont = np.hstack((np.reshape(np.array(x_temp),[len(x_temp),1]),
248 | np.reshape(np.array(y_temp),[len(y_temp),1])))
249 | tck, u = splprep(temp_cont.T, u=None, s=13, per=1)
250 | u_new = np.linspace(u.min(), u.max(), len(cont))
251 | x_spline, y_spline = splev(u_new, tck, der=0)
252 | points = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
253 | np.reshape(np.array(y_spline),[len(y_spline),1]))) # use spline-smoothed points
254 | cummul_paths[c] = Path(points,closed=True)
255 |
256 | # false to exit while loop
257 | cont_loop = False
258 | except:
259 | overlap = False
260 |
261 |
262 |
263 | if overlap == False:
264 | # check if points are inside contour (removed from mask)
265 | cummul_paths[c] = conts_paths[c]
266 | c += 1
267 | if c == dummy_cont_count: cont_loop = False
268 |
269 |
270 | # adjust contours, ignoring the overlap to get true vertical etch
271 | # stack all contours and convert to binary mask
272 | for c in conts_paths:
273 | print(' solving VERT etch in contour: ', c)
274 | new_cont_points = horiz_etch(conts_paths[c].vertices,horiz_rate,
275 | t_step,norm_span,window_len)
276 |
277 | conts_paths[c] = Path(new_cont_points,closed=True)
278 | conts_polys[c] = Polygon(new_cont_points)
279 |
280 | inside = conts_paths[c].contains_points(grid_point_pairs)
281 | z_mask += inside.astype(int).reshape(xv.shape)
282 | z_mask[z_mask>0] = 1
283 | # update topography using the z_mask
284 | z_step = z_mask * (vert_rate*t_step)
285 | # etch back; reference the last time step, except when it's the first time step
286 | try:
287 | topo[i_t] = topo[i_t-1] - z_step
288 | except:
289 | topo[i_t] -= z_step
290 |
291 |
292 | # # dummy plot
293 | ax2.plot(t,vert_rate/horiz_rate,'k')
294 | # if i_t>400 and c==3:
295 | # patch = patches.PathPatch(conts_paths[c], fill=False, lw=2)
296 | # ax2.add_patch(patch)
297 | # contourplot = plt.contourf(x, y, z_mask, 100, cmap=cmap,vmin=0, vmax=2)
298 | # ax2.autoscale()
299 | # ax2, _ = mpl.colorbar.make_axes(plt.gca())
300 | # cbar = mpl.colorbar.ColorbarBase(ax2, cmap=cmap,
301 | # norm=mpl.colors.Normalize(vmin=0, vmax=2),
302 | # label=' etch depth [um]')
303 | # cbar.set_clim(0, 2)
304 |
305 |
306 |
307 | fig_2d_cont, ax1_2d_cont = plt.subplots(figsize=(13,12))
308 |
309 | # now solve the horizontal step in the combined contour
310 | for c in cummul_paths:
311 | print('    solving HORIZ etch in cumulative contour: ', c)
312 |
313 | curr_path = cummul_paths[c]
314 | try:
315 | updated_cont = horiz_etch(curr_path.vertices,horiz_rate,
316 | t_step,norm_span,window_len)
317 | except:
318 | pass
319 | dummy_cont_path = Path(updated_cont,closed=True)
320 | patch = patches.PathPatch(dummy_cont_path, fill=False, lw=2)
321 | ax1_2d_cont.add_patch(patch)
322 | ax1_2d_cont.plot(dummy_cont_path.vertices[:,0],
323 | dummy_cont_path.vertices[:,1],'k')
324 |
325 |
326 | # plot 2d contour
327 | contourplot = plt.contourf(x, y, topo[i_t], 500, cmap=cmap,vmin=vmin, vmax=vmax)
328 | title_str = 't = %s s' % str(t)
329 | plt.title(title_str)
330 | ax1_2d_cont, _ = mpl.colorbar.make_axes(plt.gca())
331 | cbar = mpl.colorbar.ColorbarBase(ax1_2d_cont, cmap=cmap,
332 | norm=mpl.colors.Normalize(vmin=vmin, vmax=vmax),
333 | label=' etch depth [um]')
334 | cbar.set_clim(vmin, vmax)
335 | ax1_2d_cont.autoscale()
336 | out_fig = 'C:/Users/nicas/Documents/E241-MicroNanoFab/codes/comb_contours/' + \
337 | str(t) + '.png'
338 | plt.savefig(out_fig, bbox_inches='tight')
339 | plt.close()
340 |
341 |
342 |
343 | # plot 3d surface
344 | fig_3d_surf = plt.figure(figsize=(24,10))
345 | # ax2_3d_surf = fig_3d_surf.gca(projection='3d')
346 | ax2_3d_surf = fig_3d_surf.add_subplot(111, projection='3d')
347 |
348 | # fig_3d_surf = plt.figure(figsize=(20,11))
349 | # ax2_3d_surf = fig_3d_surf.add_subplot(1,2,1,projection='3d')
350 | surf = ax2_3d_surf.plot_surface(x, y, topo[i_t-1], rstride=rstride,
351 | cstride=cstride,
352 | cmap=cmap,vmin=vmin, vmax=vmax,
353 | linewidth=0, antialiased=False)
354 | ax2_3d_surf.set_zlim(vmin, vmax)
355 | ax2_3d_surf.view_init(65, -60)
356 | title_str = 't = %s s' % str(t)
357 | plt.title(title_str)
358 | # add a color bar which maps values to colors.
359 | ax2_3d_surf, _ = mpl.colorbar.make_axes(plt.gca())
360 | cbar = mpl.colorbar.ColorbarBase(ax2_3d_surf, cmap=cmap,
361 | norm=mpl.colors.Normalize(vmin=vmin, vmax=vmax),
362 | label=' etch depth [um]')
363 | cbar.set_clim(vmin, vmax)
364 | ax2_3d_surf.autoscale()
365 | # fig_3d_surf.colorbar(surf, shrink=0.5, aspect=5)
366 | out_fig = 'C:/Users/nicas/Documents/E241-MicroNanoFab/codes/comb_contours_3d/' + \
367 | str(t) + '.png'
368 | plt.savefig(out_fig, bbox_inches='tight')
369 | plt.close()
370 |
371 |
372 |
373 | fig3, ax3 = plt.subplots(figsize=(8,7))
374 | for i,_ in enumerate(x):
375 | line_x = np.sqrt(x[i,i]**2 + y[i,i]**2)
376 | ax3.scatter(line_x,topo[i_t-1][i,i])
377 | # ax2.set_title(title_str)
378 |
379 | # ax2.add_patch(patch_dummy)
380 | ax3.autoscale()
381 | plt.show()
382 |
383 |
384 | fig3, ax3 = plt.subplots(figsize=(8,7))
385 | dummy_i = int(n_points/2)
386 | plt.plot(x[dummy_i,:],topo[i_t-1][dummy_i,:])
387 | # ax2.set_title(title_str)
388 |
389 | # ax2.add_patch(patch_dummy)
390 | plt.show()
391 |
392 |
--------------------------------------------------------------------------------
/etch_model_version3_cellular_automata.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import matplotlib as mpl
3 | import matplotlib.cm as cmx
4 | import matplotlib.pyplot as plt
5 | import matplotlib.patches as patches
6 | import cv2
7 | import pyvista as pv
8 | import open3d as o3d
9 | import shapely.geometry as geometry
10 | import time
11 | #from skimage import measure
12 | from ast import literal_eval
13 | from copy import deepcopy
14 | from scipy.spatial import distance as dist
15 | from scipy.spatial import Delaunay
16 | from sklearn.neighbors import NearestNeighbors, KNeighborsRegressor
17 | from matplotlib.path import Path
18 | from shapely.geometry.polygon import Polygon
19 | from shapely.ops import cascaded_union, polygonize
20 | from mpl_toolkits.mplot3d import Axes3D
21 | from scipy.interpolate import splprep, splev
22 | from etch_sim_utilities import *
23 |
24 | nps = pv.vtk.util.numpy_support
25 |
26 | mpl.style.use('default')
27 |
28 | start_time = time.time()
29 |
30 | """
31 | Written by Nicolas Castano
32 |
33 | Model continuous etching into a silicon wafer on the PT-DSE tool in the SNF
34 | based on known etch rates.
35 |
36 | Data is stored in an ordered dictionary with keys for each specific step, kept
37 | in the order in which it was created:
38 | etch_grid = {'init': [pv.PolyData(mask_cont_0), pv.PolyData(mask_cont_1),
39 | pv.PolyData(mask_cont_2), ...],
40 | global_step_0: [pv.PolyData(mask_cont_0),
41 | pv.PolyData(mask_cont_1),
42 | pv.PolyData(mask_cont_2), ...],
43 | global_step_N: [pv.PolyData(mask_cont_0)]}
44 |
45 | """
46 |
47 |
48 | ###############################################################################
49 | ################################ USER INPUTS ##################################
50 | ###############################################################################
51 |
52 | # define recipe
53 | # ex: {'step1':{'bosch':13,'iso':100,'cycles':7},
54 | # 'step2':{'bosch':240,'iso':None,'cycles':240},
55 | # 'step3':{'bosch':None,'iso':70,'cycles':1}}
56 | #recipe_steps = {'step1':{'bosch':13,'iso':100,'cycles':7},
57 | # 'step2':{'bosch':240,'iso':None,'cycles':240},
58 | # 'step3':{'bosch':None,'iso':70,'cycles':1}}
59 | #recipe_steps = {'step0':{'bosch':7,'iso':5,'cycles':2}}
60 | #recipe_steps = {'step01':{'bosch':15,'iso':100,'cycles':7},
61 | ## 'step02':{'bosch':300,'iso':None,'cycles':300},
62 | # 'step03':{'bosch':None,'iso':100,'cycles':1}}
63 | recipe_steps = {'step01':{'bosch':15,'iso':100,'cycles':4},
64 | # 'step02':{'bosch':300,'iso':None,'cycles':300},
65 | 'step03':{'bosch':None,'iso':800,'cycles':1}}
66 |
67 |
68 | # load mask
69 | im_dir = 'C:/Users/nicas/Documents/E241-MicroNanoFab/masks/'
70 | im_file = 'python_model_fil_sq.png'
71 | pixel_um_conv = 276/100 # px/um
72 | gap = 249/pixel_um_conv
73 | # 151.37/100 # for R2_C2
74 | # for R5_C3: 49.291/80.4384 # px/um
75 |
76 | # read in mask image and define contour
77 | im_path = im_dir + im_file
78 | curr_im = cv2.imread(im_path, cv2.IMREAD_ANYDEPTH)
79 | curr_im = cv2.GaussianBlur(curr_im,(3,3),0)
80 | rgb_im = cv2.cvtColor(curr_im, cv2.COLOR_GRAY2RGB)
81 | cont_im, conts, hier = cv2.findContours(curr_im, cv2.RETR_LIST,
82 | cv2.CHAIN_APPROX_NONE)
83 | conts_im = cv2.drawContours(rgb_im, conts, -1, (0,255,0),3)
84 | # show the contour to verify
85 | dummy_i = im_file.find('.png')
86 | out_file = im_dir + im_file[:dummy_i] + '_out' + im_file[dummy_i:]
87 | cv2.imwrite(out_file, conts_im)
88 |
89 | cell_size = 8 # microns
90 | wafer_thickness = 500 # microns
91 |
92 | t_start = 0 # seconds
93 | t_step = 5
94 |
95 | h = curr_im.shape[0]
96 | w = curr_im.shape[1]
97 |
98 | contour_read_step = 5
99 | topo_im = np.zeros_like(curr_im)
100 | norm_span = 7 # span of data points taken for computing normals
101 | window_len = 17 # for smoothing of mask contour read
102 | horiz_to_vert_rate_ratio = 0.6
103 | def vert_rate(z):
104 | a = 0.141
105 | b = 0.0007
106 | # return a*np.exp(-b*z)
107 | return (0.14-0.02) + z*(0.03/500)
108 | # d = a*np.exp(-b*z)
109 | # return d + d/2* np.cos(z * 2*np.pi/10)
110 | def horiz_rate(z):
111 | r = (0.65-0.1)+z*(0.1/500)
112 | return r*vert_rate(z)
113 | # return 0.6 + 0.2* np.cos(z * 2*np.pi/10)
114 |
115 | #vert_rate = 8.5/60 # um/s
116 | def bosch_vert_step(z):
117 | return (0.84-0.1) + 0.1/500*z
118 | # return 0.84 + 0.4* np.cos(z * 2*np.pi/10)
119 |
120 | #bosch_vert_step = 0.84 # um/step
121 |
122 | #horiz_rate = 0.09# vert_rate*0.6#90/600 # vert_rate*0.6 # um/s
123 |
124 | # advanced settings
125 | set_res = 3000 # resolution of vtk plane (mesh density)
126 | cmap = 'gnuplot' # 'inferno' 'viridis' # 'hot'
127 | vmin = -290 # expected range of depth for color bar (min)
128 | vmax = 0
129 | # for plotting the surface plot
130 | rstride = 2
131 | cstride = 2
132 |
133 |
134 | ###############################################################################
135 | ###############################################################################
136 | ###############################################################################
137 |
138 |
139 | # initialize global topo data container following data structure
140 | # indicated in the script header
141 | # construct global data container; this is an ordered dictionary so later we
142 | # can loop over keys and ensure steps are processed in the order they were created
143 | etch_grid = define_steps(recipe_steps, t_start, t_step)
144 | n_steps = len(list(etch_grid.keys()))
145 |
146 |
147 | # construct mask paths and check if cell centers are within masks
148 | # path objects used for determining if point is within mask
149 | mask_paths = {}
150 | # build initial geometries from mask that will be tracked through solution
151 | print('building initial features')
152 | x_min, x_max, y_min, y_max = 10**8, -10**8, 10**8, -10**8
153 | for c, cont in enumerate(conts):
154 | x = []
155 | y = []
156 | # gather points in mask contours
157 | for p, point in enumerate(cont):
158 | if p%contour_read_step == 0:
159 | # translate point so mask centered at 0,0
160 | temp_x = point[0][0]/pixel_um_conv - (w/pixel_um_conv)/2
161 | temp_y = point[0][1]/pixel_um_conv - (h/pixel_um_conv)/2
162 | x.append(temp_x)
163 | y.append(temp_y)
164 | if temp_x < x_min: x_min = round(temp_x,3)
165 | if temp_y < y_min: y_min = round(temp_y,3)
166 | if temp_x > x_max: x_max = round(temp_x,3)
167 | if temp_y > y_max: y_max = round(temp_y,3)
168 |
169 | # force last point to be on top of first point to close the polygon
170 | # remove redundant points
171 | x[-1] = x[0]
172 | y[-1] = y[0]
173 | # smooth contour with spline
174 | points = np.hstack((np.reshape(np.array(x),[len(x),1]),
175 | np.reshape(np.array(y),[len(y),1])))
176 | tck, u = splprep(points.T, u=None, s=0.0, per=1)
177 | u_new = np.linspace(u.min(), u.max(), len(cont))
178 | x_spline, y_spline = splev(u_new, tck, der=0)
179 | points = np.hstack((np.reshape(np.array(x_spline),[len(x_spline),1]),
180 | np.reshape(np.array(y_spline),[len(y_spline),1])))
181 | # make polygon objects
182 | mask_poly = Polygon(points)
183 | # make path objects (has nice contains_points method)
184 | mask_paths[c] = Path(mask_poly.exterior,closed=True)
185 | # this just means there's no buffer region around the feature
186 | buff_mask = mask_paths[c]
187 |
188 |
189 |
190 | # initialize the starting grid that will be etched away using cellular
191 | # automata method
192 | x_axis = np.arange(x_min-cell_size,x_max+cell_size,cell_size)
193 | y_axis = np.arange(y_min-cell_size,y_max+cell_size,cell_size)
194 | z_axis = np.array([wafer_thickness-cell_size,
195 | wafer_thickness])
196 |
197 | x_nodes, y_nodes, z_nodes = np.meshgrid(x_axis, y_axis, z_axis)
198 | init_grid = pv.StructuredGrid(x_nodes, y_nodes, z_nodes)
199 | init_grid = init_grid.cast_to_unstructured_grid()
200 |
201 | start_time = time.time()
202 | # initialize the cell dictionaries for 'exposed_cells'
203 | # key: tuple of cell center coord
204 | # value: {'state': 0 to 1, 'in_mask': True or False,
205 | # 'neighbors':[(coord neigh 0),(coord neigh 1),...]}
206 | # and for 'neighbor_cells'
207 | # this is a set that contains tuples of coords of cells that are added to
208 | # be neighbors of exposed cells
209 | exposed_cells = {}
210 | neighbor_cells = set()
211 | removed_cells = set()
212 | known_in_mask_coords = set()
213 | # get cell centers for finding cells that are in the mask
214 | cell_centers = np.around(np.array(init_grid.cell_centers().points),3)
215 | n_cells= cell_centers.shape[0]
216 | for i_cell, pt in enumerate(cell_centers):
217 | if i_cell % int(n_cells/10) == 0:
218 | print(' finding exposed and neighbor cells: %i of %i' % \
219 | (i_cell,n_cells-1))
220 | temp_cell = init_grid.GetCell(i_cell)
221 | in_mask = is_in_mask(pt[0],pt[1],mask_paths)
222 | if in_mask == True:
223 | temp_tuple = tuple(pt[0:2])
224 | # had to use str here for this to work
225 | known_in_mask_coords.add(str(temp_tuple))
226 |
227 | cell_tuple = tuple(pt)
228 | exposed_cells[cell_tuple] = {'state':1, 'in_mask':in_mask,
229 | 'neighbors':[],'normal':[],
230 | 'need_norm':True}
231 | neighs = compute_neigh_centers(pt,cell_size)
232 | for neigh in neighs:
233 | # store neighbor if it's not a surface cell or it's outside of the mask paths
234 | if ((neigh[2] != (wafer_thickness - cell_size/2)) or \
235 | (neigh[2] == (wafer_thickness - cell_size/2) and \
236 | is_in_mask(neigh[0], neigh[1], mask_paths) == False)):
237 | neigh_tuple = tuple(neigh)
238 | exposed_cells[cell_tuple]['neighbors'].append(neigh)
239 | # only store neighbors on lower layer or if neighbor outside
240 | # of mask
241 | neighbor_cells.add(neigh_tuple)
242 | else:
243 | removed_cells.add(str(cell_tuple))
244 |
245 | # compute normals from neighbors
246 | for cell_tuple in exposed_cells:
247 | exposed_cells[cell_tuple]['normal'] = np.array([0,0,1])#normal_from_neighbors(cell_tuple,
248 | # removed_cells,
249 | # cell_size=cell_size)
250 |
251 |
252 | ## extract all exposed_cells and neighbor_cells to the same unstructured grid
253 | #init_grid = make_grid([exposed_cells,neighbor_cells], cell_size)
254 |
255 |
256 | # define the grid for the initial step
257 | print('making init grid')
258 | etch_grid['init'] = make_cloud((exposed_cells, neighbor_cells))
259 |
260 |
261 | print('-------- %.2f seconds ---------' % (time.time()-start_time))
262 |
263 |
264 | step_index_lookup = {i:key for i,key in enumerate(etch_grid)}
265 | print('')
266 | # loop over etch_grid keys (after init) which represent each detailed step
267 | loop_steps = [key for key in list(etch_grid.keys()) if 'init' not in key]
268 | curr_process = 'init'
269 | extract_flag = False
270 | d = cell_size
271 | center_to_surface = wafer_thickness - cell_size/2
272 | diag_dist = round(np.sqrt(2*cell_size**2),3)
273 |
274 | fig = plt.figure()
275 | ax = fig.add_subplot(111, projection='3d')
276 | #ax = plt.subplot(111,projection='polar')
277 | plot_flag = True
278 | #
279 | #dummy_steps = loop_steps[loop_steps.index(step)-1:]
280 | for step_i, step in enumerate(loop_steps,start=1):
281 | #for step_i, step in enumerate(dummy_steps,start=loop_steps.index(step)):
282 |
283 | #plot some results
284 | if step_i % (int(len(loop_steps)/10)) == 0:
285 | print('making plot for %s' % step)
286 | neigh_pts,neigh_states,_ = make_cloud([neighbor_cells])
287 | neigh_obj = pv.PolyData(neigh_pts)
288 | plotter = pv.BackgroundPlotter()
289 | plotter.isometric_view()
290 | plotter.add_mesh(neigh_obj, show_edges=True,
291 | scalars=neigh_obj.points[:,2],
292 | point_size=8,
293 | render_points_as_spheres=True,
294 | cmap='inferno',
295 | clim=[min(neigh_obj.points[:,2]),
296 | wafer_thickness])
297 | plotter.add_scalar_bar(title='z height',height=0.08,width=0.4,
298 | position_x=0.01,position_y=0.01)
299 | plotter.add_text(step)
300 |
301 |
302 | # if dummy_i != len(etch_grid)-1: step_i = dummy_i + 1
303 | master_step = step.split('_')[0]
304 |
305 | # get list of features from last step to etch in current step
306 | curr_grid = etch_grid[step_index_lookup[step_i-1]]
307 |
308 | # define current process for printing and switching
309 | prev_process = curr_process
310 | if 'bosch-iso' in step:
311 | if 'isotime0' in step:
312 | curr_process = 'bosch-iso: bosch'
313 | else:
314 | curr_process = 'bosch-iso: iso'
315 | else:
316 | if 'bosch' in step.split('_')[1]:
317 | curr_process = 'bosch'
318 | elif 'iso' in step.split('_')[1]:
319 | curr_process = 'iso'
320 | if curr_process != prev_process:
321 | print('current process: %s \t step %i of %i'
322 | %(step, step_i,n_steps))
323 |
324 | # if extract_flag was activated
325 | if extract_flag == True:
326 | extract_flag = False
327 | print('\t\tcollecting neighbors and deleting cells')
328 | del_cells = [] # for deleting later
329 | convert_neighbors = set()
330 | new_cell_neighbors = {}
331 |
332 | # flag cells to be deleted from exposed_cells and store removed cells
333 | for cell_center in exposed_cells:
334 | if exposed_cells[cell_center]['state'] < 0:
335 | del_cells.append(tuple(cell_center)) # collect cells to delete
336 | # store removed cells
337 | removed_cells.add(str(tuple(cell_center)))
338 | elif np.isnan(exposed_cells[cell_center]['state']):
339 | del_cells.append(tuple(cell_center))
340 | print('\t\tremoving nan cells')
341 |
342 | # collect neighbors to be converted to exposed cells
343 | for cell_center in exposed_cells:
344 | if exposed_cells[cell_center]['state'] < 0:
345 | for neigh_center in exposed_cells[cell_center]['neighbors']:
346 | if str(tuple(neigh_center)) not in removed_cells:
347 | convert_neighbors.add(tuple(neigh_center))
348 |
349 |
350 |
351 | # remove from exposed_cells
352 | for cell in del_cells: del exposed_cells[cell]
353 |
354 | print('\t\tconverting neighbor cells to exposed')
355 | # add converted neighbors to exposed_cells container
356 | for i_cell, new_cell in enumerate(list(convert_neighbors)):
357 | # collect neighbors for new cell
358 | temp_tuple = tuple([list(new_cell)[0], list(new_cell)[1]])
359 | coord_tuple = str(temp_tuple)
360 | if coord_tuple in known_in_mask_coords:
361 | label = True
362 | else:
363 | label = False#is_in_mask(x,y,mask_paths)
364 | exposed_cells[new_cell] = {'in_mask':label, 'state':1,
365 | 'neighbors':[],'normal':[],
366 | 'need_norm':True}
367 |
368 | # with only face neighbors, all neighbors of exposed cells
369 | # should be now exposed
370 | for convert_neigh in list(convert_neighbors):
371 | new_cell_neighbors[tuple(convert_neigh)] = []
372 | # determine new neighbors
373 | x, y, z = convert_neigh[0], convert_neigh[1], convert_neigh[2]
374 | new_cell_neigh_centers = \
375 | compute_neigh_centers(np.array(convert_neigh),cell_size)
376 | for neigh_center in new_cell_neigh_centers:
377 | temp_tuple = tuple(neigh_center)
378 | if (str(temp_tuple) not in removed_cells and \
379 | temp_tuple not in exposed_cells and \
380 | neigh_center[2] != center_to_surface):
381 |
382 | neighbor_cells.add(temp_tuple)
383 |
384 | new_cell_neighbors[tuple(convert_neigh)].append(neigh_center)
385 | # store neigh for each added cell in dict
386 | # add converted neighbors to exposed_cells container
387 | for i_cell, new_cell in enumerate(convert_neighbors):
388 | # collect neighbors for new cell
389 | neighs = [n for n in new_cell_neighbors[new_cell]]
390 | exposed_cells[tuple(new_cell)]['neighbors'] = neighs
391 | try:
392 | neighbor_cells.remove(new_cell)
393 | except:
394 | pass
395 | # compute normals for exposed cells
396 | exposed_cells = compute_normals_for_cells(exposed_cells,removed_cells,
397 | cell_size)
398 |
399 |
400 | if (curr_process == 'bosch-iso: bosch' or curr_process == 'bosch'):
401 | # bosch step is a vertical etch of exposed_cells in the x, y bounds
402 | # of the mask ('in_mask' == True)
403 | n_bosch_steps = recipe_steps[master_step]['bosch']
404 | if curr_process == 'bosch-iso: bosch':
405 | curr_bosch_step = int(step.split('_')[-2].split('bosch')[-1])
406 | else:
407 | curr_bosch_step = int(step.split('_')[-1].split('bosch')[-1])
408 |
409 |
410 | print('\tbosch step %i of %i' % \
411 | (curr_bosch_step, n_bosch_steps))
412 |
413 | for cell_center in exposed_cells:
414 | if exposed_cells[cell_center]['in_mask'] == True:
415 | exposed_cells[cell_center]['state'] -= bosch_vert_step(cell_center[2])/ \
416 | cell_size
417 | if (extract_flag == False and \
418 | exposed_cells[cell_center]['state'] < 0):
419 | extract_flag = True
420 |
421 |
422 | # update next step topo
423 | grid_plot = make_cloud((exposed_cells, neighbor_cells))
424 | etch_grid[step] = grid_plot
425 |
426 | elif (curr_process == 'bosch-iso: iso' or curr_process == 'iso'):
427 | # iso step etches all exposed_cells, scaled by the surface-normal angle
428 |
429 | n_iso_steps = recipe_steps[master_step]['iso']
430 | curr_iso_step = int(step.split('_')[-1].split('isotime')[-1])
431 |
432 | print('\tiso time %i of %i seconds' % \
433 | (curr_iso_step, n_iso_steps))
434 |
435 | # compute normals
436 | exp_pts = make_cloud([exposed_cells])[0]
437 | # exp_norms = compute_normals(exp_pts,use_nn=5,
438 | # ref_pt=np.array([np.mean(exp_pts[:,0]),
439 |
440 |
441 | angles = []
442 | amounts = []
443 | x_center, y_center, z_center = [],[],[]
444 | for i_cell, cell_center in enumerate(exposed_cells):
445 | angle = compute_angle(exposed_cells[cell_center]['normal'],
446 | ref_pt=np.array([np.mean(exp_pts[:,0]),
447 | np.mean(exp_pts[:,1]),
448 | np.mean(exp_pts[:,2])]))
449 | curr_vert_rate = vert_rate(cell_center[2])
450 | curr_horiz_rate = horiz_rate(cell_center[2])
451 | etch_amount = (curr_vert_rate*t_step)*np.cos(angle) + \
452 | (curr_horiz_rate*t_step)*np.sin(angle)
453 | # etch_amount = (horiz_rate*t_step)*np.cos(angle)
454 | exposed_cells[cell_center]['state'] -= etch_amount/cell_size
455 |
456 | if (extract_flag == False and \
457 | exposed_cells[cell_center]['state'] < 0):
458 | extract_flag = True
459 | if step_i > 30 and plot_flag == True:
460 | angles.append(angle)
461 | amounts.append(etch_amount)
462 | x_center.append(cell_center[0])
463 | y_center.append(cell_center[1])
464 | z_center.append(cell_center[2])
465 |
466 | if step_i > 30 and plot_flag == True:
467 | print('plotting!')
468 | colors_map = 'inferno'
469 | cm = plt.get_cmap(colors_map)
470 | c_norm = mpl.colors.Normalize(vmin=min(amounts), vmax=max(amounts))
471 | scalarMap = cmx.ScalarMappable(norm=c_norm, cmap=cm)
472 | scalarMap.set_array(amounts)
473 | fig.colorbar(scalarMap, shrink=0.5, aspect=5)
474 | ax.scatter(x_center, y_center, z_center,c=scalarMap.to_rgba(amounts))
475 | plot_flag = False
476 |
477 | # update next step topo
478 | grid_plot = make_cloud((exposed_cells, neighbor_cells))
479 | etch_grid[step] = grid_plot
480 |
481 | else:
482 | etch_grid[step] = make_cloud((exposed_cells, neighbor_cells))
483 | extract_flag = False
484 | pass
485 |
486 |
487 | print('-------- %.2f seconds ---------' % (time.time()-start_time))
488 |
489 |
490 | exposed_pts,exposed_states,_ = make_cloud([exposed_cells])
491 | neigh_pts,neigh_states,_ = make_cloud([neighbor_cells])
492 | plot_point_cloud((exposed_pts,neigh_pts),scalar='z')
493 |
494 | with_data = [g for g in etch_grid if len(etch_grid[g]) != 0]
495 | plot_png_dir = 'C:/Users/nicas/Documents/E241-MicroNanoFab/codes/' + \
496 | 'etch_model_version5_1/'
497 |
498 | #dict_file = plot_png_dir + 'exposed_cells_mask5_R5_C3.txt'
499 | ## save exposed_cell to file
500 | #with open(dict_file, 'w') as file:
501 | # file.write(json.dumps(exposed_cells))
502 | # save vtk file
503 | exposed_obj = pv.PolyData(exposed_pts)
504 | neigh_obj = pv.PolyData(neigh_pts)
505 | vtk_save_exp_obj = plot_png_dir + 'exposed_obj.vtk'
506 | vtk_save_neigh_obj = plot_png_dir + 'neigh_obj.vtk'
507 |
508 | exposed_obj.save(vtk_save_exp_obj)
509 | neigh_obj.save(vtk_save_neigh_obj)
510 |
511 | #pcd_exp = o3d.geometry.PointCloud()
512 | #pcd_neigh = o3d.geometry.PointCloud()
513 | #pcd_exp.points = o3d.utility.Vector3dVector(exposed_pts)
514 | #pcd_neigh.points = o3d.utility.Vector3dVector(neigh_pts)
515 | #o3d.visualization.draw_geometries([pcd_exp,pcd_neigh])
516 |
517 | #exp_norms = []
518 | #exp_pts = etch_grid['step01_bosch-iso02_bosch020_isotime5'][0]
519 | ##compute_normals_for_cells(exp_pts)
520 | #
521 | #for i,pt in enumerate(exp_pts):
522 | # if i%1000 == 0: print('%i of %i' %(i, len(exp_pts)))
523 | # try:
524 | # norm = exposed_cells[tuple(pt)]['normal']
525 | # except:
526 | # norm = normal_from_neighbors(tuple(pt),removed_cells,cell_size)
527 | # if len(norm) == 0: norm = normal_from_neighbors(tuple(pt),
528 | # removed_cells,cell_size)
529 | # exp_norms.append(norm)
530 | #pcd_exp = o3d.geometry.PointCloud()
531 | #pcd_exp.points = o3d.utility.Vector3dVector(exp_pts)
532 | #for norm in exp_norms:
533 | # pcd_exp.normals.append(norm)
534 | #o3d.visualization.draw_geometries([pcd_exp])
535 |
536 |
537 |
538 | select_data = with_data[0::int(len(with_data)/len(with_data))]
539 | for i_step,step in enumerate(loop_steps):
540 |
541 | if i_step % 5 == 0:
542 | print('writing plot .png file %i of %i' % (i_step+1,len(select_data)))
543 |
544 | pts = etch_grid[step][0]
545 | states = etch_grid[step][1]
546 | iden = etch_grid[step][2]
547 |
548 | exp_cell_idx = np.where(iden == 1)[0]
549 | neigh_cell_idx = np.where(iden == 0)[0]
550 |
551 | # unpack
552 | exp_cells,exp_states = pts[exp_cell_idx],states[exp_cell_idx]
553 | neigh_cells,neigh_states = pts[neigh_cell_idx],states[neigh_cell_idx]
554 |
555 | exposed_obj = pv.PolyData(exp_cells)
556 | neigh_obj = pv.PolyData(neigh_cells)
557 |
558 | plotter = pv.Plotter(off_screen=True)
559 | plotter.isometric_view()
560 |
561 | # plotter.add_mesh(neigh_obj, show_edges=True,
562 | # scalars=neigh_obj.points[:,2],
563 | # point_size=8,
564 | # render_points_as_spheres=True,
565 | # cmap='inferno',
566 | # clim=[143, 500])
567 | # plotter.add_scalar_bar(title='z height',height=0.08,width=0.4,
568 | # position_x=0.01,position_y=0.01)
569 | plotter.add_text(step)
570 |
571 | plotter.add_mesh(exposed_obj, show_edges=True,
572 | scalars=exposed_obj.points[:,2],
573 | point_size=8,
574 | render_points_as_spheres=True,
575 | cmap='rainbow',
576 | clim=[min(exposed_obj.points[:,2]), wafer_thickness])
577 | plotter.add_scalar_bar(title='z_height',height=0.08,width=0.4,
578 | position_x=0.01,position_y=0.1)
579 |
580 | file_name = plot_png_dir + step + '.png'
581 |
582 | plotter.screenshot(file_name,transparent_background=True)
583 |
584 |
585 | with_data = [g for g in etch_grid if len(etch_grid[g]) != 0]
586 | select_data = with_data[-2:-1:1]#int(len(with_data)/10)]
587 | for ind,step in enumerate(select_data):#range(20):
588 | pts = etch_grid[step][0]
589 | states = etch_grid[step][1]
590 | iden = etch_grid[step][2]
591 |
592 | exp_cell_idx = np.where(iden == 1)[0]
593 | neigh_cell_idx = np.where(iden == 0)[0]
594 |
595 | exp_cells,exp_states = pts[exp_cell_idx],states[exp_cell_idx]
596 | neigh_cells,neigh_states = pts[neigh_cell_idx],states[neigh_cell_idx]
597 |
598 | cells = pv.PolyData(exp_cells)
599 | neighs = pv.PolyData(neigh_cells)
600 |
601 | plotter = pv.BackgroundPlotter(title='exp_cells',
602 | window_size=[1024, 768])
603 |
604 | plotter.add_mesh(neighs, show_edges=True,
605 | scalars=neighs.points[:,2],
606 | point_size=10,
607 | render_points_as_spheres=True,
608 | cmap='inferno',
609 | clim=[min(neighs.points[:,2]), wafer_thickness])
610 | plotter.add_scalar_bar(title='z height',height=0.08,width=0.4,
611 | position_x=0.01,position_y=0.01)
612 |
613 | # plotter.add_mesh(cells, show_edges=True,
614 | # scalars=exp_states,#cells.points[:,2],
615 | # point_size=8,
616 | # render_points_as_spheres=True,
617 | # cmap='inferno',
618 | # clim=[0,1])#[min(cells.points[:,2]), wafer_thickness])
619 | # plotter.add_scalar_bar(title='z height',height=0.08,width=0.4,
620 | # position_x=0.01,position_y=0.1)
621 |
622 | plotter.add_text(step)
623 |
624 |
625 |
626 | ######
627 |
628 | pts = etch_grid['step03_iso01_isotime520'][0]#loop_steps[-1]][0]
629 | states = etch_grid[loop_steps[-1]][1]
630 | iden = etch_grid[loop_steps[-1]][2]
631 |
632 | exp_cell_idx = np.where(iden == 1)[0]
633 |
634 | exp_cells,exp_states = pts[exp_cell_idx],states[exp_cell_idx]
635 |
636 |
637 |
638 | exposed_obj = pv.read(vtk_save_exp_obj)#pv.PolyData(exp_cells)
639 | smooth = exposed_obj.smooth(n_iter=100)
640 | plotter = pv.BackgroundPlotter(title=loop_steps[-1],
641 | window_size=[1024, 768])
642 | plotter.add_mesh(exposed_obj, show_edges=True,
643 | scalars=exposed_obj.points[:,2],
644 | point_size=8,
645 | render_points_as_spheres=True,
646 | cmap='inferno',
647 | clim=[min(exposed_obj.points[:,2]), wafer_thickness])
648 | plotter.add_scalar_bar(title='z_height',height=0.08,width=0.4,
649 | position_x=0.01,position_y=0.1)
650 |
651 |
652 | neigh_obj = pv.read(vtk_save_neigh_obj)#pv.PolyData(exp_cells)
653 | smooth = neigh_obj.smooth(n_iter=1000)
654 | plotter = pv.BackgroundPlotter(title=loop_steps[-1],
655 | window_size=[1024, 768])
656 | plotter.add_mesh(neigh_obj, show_edges=True,
657 | scalars=smooth.points[:,2],
658 | point_size=8,
659 | render_points_as_spheres=True,
660 | cmap='inferno',
661 | clim=[min(smooth.points[:,2]), wafer_thickness])
662 | plotter.add_scalar_bar(title='z_height',height=0.08,width=0.4,
663 | position_x=0.01,position_y=0.1)
664 |
665 | pts = etch_grid[loop_steps[-1]][0]
666 |
667 | pts = etch_grid['step03_iso01_isotime520'][0]#loop_steps[-1]][0]
668 | #cells = np.asarray(exposed_cells.points)
669 | obj = make_grid([pts],cell_size)
670 |
671 | #obj = pv.read('C:/Users/nicas/Documents/E241-MicroNanoFab/codes/etch_model_version5_1/exposed_obj.vtk')
672 | #obj = make_grid([np.array(obj.points)],cell_size)
673 |
674 | slices = obj.slice(normal=[1,1,0])
675 | plotter = pv.BackgroundPlotter(window_size=[1024, 768])
676 | plotter.add_mesh(obj, show_edges=False,
677 | scalars=obj.points[:,2],
678 | cmap='inferno',
679 | clim=[min(obj.points[:,2]),
680 | wafer_thickness])
681 |
682 | x,z = cross_section_slice(pts,cell_size,p1=(-200,-200),p2=(200,200))
683 | plt.plot(x,z)
684 |
--------------------------------------------------------------------------------
/etch_sim_utilities.py:
--------------------------------------------------------------------------------
1 | import numpy as np
2 | import pyvista as pv
3 | import matplotlib as mpl
4 | import matplotlib.pyplot  # so mpl.pyplot below resolves when this module is imported on its own
5 | from sklearn.neighbors import NearestNeighbors, KNeighborsRegressor
6 | 
7 | # helper functions for the cellular automata silicon etching simulation
8 | def define_steps(recipe_steps, t_start, t_step):
9 | from collections import OrderedDict
10 | etch_grid = OrderedDict()
11 | etch_grid['init'] = []
12 | print('constructing specific step keys for topo container')
13 | for step in recipe_steps:
14 | for i_cycle, cycles in enumerate(range(recipe_steps[step]['cycles'])):
15 | # if it is a combined bosch-iso step
16 |             # zero-pad the 1-indexed cycle number to two digits so the
17 |             # string keys sort in order (01, 02, ..., 09, 10, 11, ...)
18 |             i_cycle_str = str(i_cycle+1).zfill(2)
21 |
22 | if recipe_steps[step]['bosch'] != None and \
23 | recipe_steps[step]['iso'] != None:
24 |                 # construct detailed keys for the data container, e.g. the key
25 |                 # step1_bosch-iso6_bosch12_isotime100 holds data for the 100th
26 |                 # second of an iso etch following the 12th bosch step in the
27 |                 # 6th cycle of a combined bosch-iso 1st recipe step; combined
28 |                 # bosch-iso etching starts with a bosch step, whose key carries
29 |                 # an "_isotime0" flag (see the commented example after this function)
30 | for i_bosch in range(recipe_steps[step]['bosch']):
31 |                     # zero-pad the 1-indexed bosch cycle number to three digits
32 |                     # so the string keys sort in order (001, ..., 010, ..., 100)
33 |                     i_bosch_str = str(i_bosch+1).zfill(3)
42 |
43 | # initial bosch cycle key
44 | key = step + '_bosch-iso' + i_cycle_str + \
45 | '_bosch' + i_bosch_str + '_isotime0'
46 | etch_grid[key] = []
47 | for i_t,t in enumerate(range(t_start,
48 | recipe_steps[step]['iso'],
49 | t_step)):
50 | key = step + '_bosch-iso' + i_cycle_str + \
51 | '_bosch' + i_bosch_str + '_isotime' + str(t+t_step)
52 | etch_grid[key] = []
53 | elif recipe_steps[step]['bosch'] != None and \
54 | recipe_steps[step]['iso'] == None:
55 |                 # similar key construction but specifically for bosch etching;
56 |                 # it is assumed that each cycle of bosch etching has the same
57 |                 # etching rate
58 | for i_bosch in range(recipe_steps[step]['bosch']):
59 |                     # zero-pad the 1-indexed bosch cycle number to three digits
60 |                     # so the string keys sort in order (001, 002, ..., 100)
61 |                     i_bosch_str = str(i_bosch+1).zfill(3)
72 |
73 | key = step + '_bosch' + i_bosch_str
74 | etch_grid[key] = []
75 |
76 | elif recipe_steps[step]['bosch'] == None and \
77 | recipe_steps[step]['iso'] != None:
78 | # similar key construction but specifically for iso etching; it is
79 | # possible to have multiple cycles of iso etching, i.e. each with
80 | # different conditions/rates
81 | for i_t,t in enumerate(range(t_start,
82 | recipe_steps[step]['iso'],
83 | t_step)):
84 | key = step + '_iso' + i_cycle_str + '_isotime' + \
85 | str(t+t_step)
86 | etch_grid[key] = []
87 |
88 | return etch_grid
89 |
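# Hedged usage sketch for define_steps (not part of the original recipes): the
# recipe dict shape matches what the function reads above, but the step names
# and numbers below are invented for illustration only.
# recipe = {'step01': {'cycles': 2, 'bosch': 3,    'iso': 10},
#           'step02': {'cycles': 1, 'bosch': None, 'iso': 20}}
# grid = define_steps(recipe, t_start=0, t_step=5)
# list(grid)[:4]
# # ['init',
# #  'step01_bosch-iso01_bosch001_isotime0',
# #  'step01_bosch-iso01_bosch001_isotime5',
# #  'step01_bosch-iso01_bosch001_isotime10']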
90 | def ion_source_dist(theta, sigma=1):  # Gaussian flux peaked at normal incidence (theta = pi/2)
91 | J = 1/(sigma*np.sqrt(2*np.pi))*np.exp(-(theta-np.pi/2)**2/(2*sigma**2))
92 | return J
93 |
94 | def etch_rate():  # placeholder: rate model not implemented yet
95 |     k_b = 1.38e-23 # Boltzmann constant [J/K]
96 |     T_s = 100 + 273.15 # substrate temperature of 100 C expressed in [K]
97 | k_0 = np.linspace(0,30,30)
98 | F_r = 150 # flow rate of SF6 [sccm]
99 | return
100 |
101 | #def cross_section_slice(cells,p1=(-200,200),p2=(200,-200)):
102 | # if type(cells) == dict:
103 | # cells = np.array(list(cells.keys()))
104 |
105 | def cross_section_slice(cells,cell_size,p1=(-200,200),p2=(200,-200)):
106 | from scipy import spatial
107 | if type(cells) == dict:
108 | cells = np.array(list(cells.keys()))
109 |     elif type(cells) == set:
110 |         cells = np.array(list(cells))
111 | x_slice = np.arange(p1[0],p2[0],cell_size)
112 |     m = (p2[1] - p1[1])/(p2[0] - p1[0])  # slope of the slice line through p1 and p2
113 | y_slice = m*(x_slice-p1[0]) + p1[1]
114 | xy = cells[:,:2]
115 | z = []
116 | x_out = []
117 | for x,y in zip(x_slice,y_slice):
118 | pt = np.asarray((x,y))
119 | dist = np.sum((xy-pt)**2,axis=1)
120 | ind = np.argmin(dist)
121 | z.append(cells[ind,2])
122 | x_out.append(x)
123 |
124 | # for x,y in zip(x_slice,y_slice):
125 | # dist,ind = spatial.KDTree(xy).query((x,y))
126 | # z.append(cells[ind,2])
127 | # x_out.append(x)
128 |
129 | return(x_out,z)
130 |
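# Hedged usage sketch (coordinates invented): sample the surface height along
# the x axis of a small flat point cloud; every returned z should be 499.0.
# cloud = np.array([[x, 0., 499.] for x in np.arange(-10., 11., 1.)])
# xs, zs = cross_section_slice(cloud, cell_size=1.0, p1=(-10, 0), p2=(10, 0))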
131 | def compute_neigh_centers(cell,cell_size,wafer_thickness=500):
132 | # compute face neighbor cells
133 | x, y, z = cell[0], cell[1], cell[2]
134 | d = cell_size
135 | neighs = [[x-d, y, z],[x+d, y, z],
136 | [x, y-d, z],[x, y+d, z],
137 | [x, y, z-d],[x, y, z+d]]
138 | if z+d > wafer_thickness:
139 | neighs = neighs[:-1]
140 | return np.around(np.array(neighs),3)
141 |
142 |
143 | def get_neighbor_cell_ids(grid, cell_idx):
144 | """helper to get neighbor cell IDs."""
145 | cell = grid.GetCell(cell_idx)
146 | pids = pv.vtk_id_list_to_array(cell.GetPointIds())
147 | neighbors = set(grid.extract_points(pids)['vtkOriginalCellIds'])
148 | neighbors.discard(cell_idx)
149 | return np.array(list(neighbors))
150 |
151 | def plot_point_cloud(clouds,scalar='z'):
152 | exp_c = clouds[0]
153 | n_c = clouds[1]
154 | x_exp,y_exp,z_exp = exp_c[:,0],exp_c[:,1],exp_c[:,2]
155 | x_n,y_n,z_n = n_c[:,0],n_c[:,1],n_c[:,2]
156 |
157 | if scalar == 'z':
158 | cs_exp = z_exp
159 | cs_n = z_n
160 | else:
161 | cs_exp = scalar
162 | cs_n = z_n
163 |
164 | fig_exp = mpl.pyplot.figure()
165 | ax_exp = fig_exp.add_subplot(111, projection='3d')
166 | colors_map = 'rainbow'
167 | cm = mpl.pyplot.get_cmap(colors_map)
168 | c_norm = mpl.colors.Normalize(vmin=min(cs_exp), vmax=max(cs_exp))
169 | scalarMap = mpl.cm.ScalarMappable(norm=c_norm, cmap=cm)
170 | scalarMap.set_array(cs_exp)
171 | fig_exp.colorbar(scalarMap, shrink=0.5, aspect=5)
172 | ax_exp.scatter(x_exp, y_exp, z_exp, c=scalarMap.to_rgba(cs_exp))
173 | ax_exp.set_title('exposed points')
174 |
175 | fig_n = mpl.pyplot.figure()
176 | ax_n = fig_n.add_subplot(111, projection='3d')
177 | colors_map = 'rainbow'
178 | cm = mpl.pyplot.get_cmap(colors_map)
179 | c_norm = mpl.colors.Normalize(vmin=min(cs_n), vmax=max(cs_n))
180 | scalarMap = mpl.cm.ScalarMappable(norm=c_norm, cmap=cm)
181 | scalarMap.set_array(cs_n)
182 | fig_n.colorbar(scalarMap, shrink=0.5, aspect=5)
183 | ax_n.scatter(x_n, y_n, z_n, c=scalarMap.to_rgba(cs_n))
184 | ax_n.set_title('neighbor points')
185 | return
186 |
187 | def plot_keys(container):
188 | x, y, z = [], [], []
189 | for key in container:
190 | if type(key) == str: key = eval(key)
191 | x.append(key[0])
192 | y.append(key[1])
193 | z.append(key[2])
194 |
195 | fig = mpl.pyplot.figure()
196 | ax = fig.add_subplot(111, projection='3d')
197 | ax.scatter(x, y, z)
198 |
199 | def remove_cells_by_state(grid, extract_idx):
200 | out_grid = grid.threshold([0,1], scalars='state', invert=False)
201 | out_grid.cell_arrays['on_surface'].dtype = bool
202 | out_grid.cell_arrays['exposed'].dtype = bool
203 | out_grid.cell_arrays['in_mask'].dtype = bool
204 | out_grid.cell_arrays['is_neighbor'].dtype = bool
205 | return out_grid
206 |
207 | def make_cloud(container,is_cont_of_str=False):
208 |     # helper function to make an np point cloud from cell center points passed
209 |     # as dictionary keys or a set of tuples; returns points, states, and an id
210 |     # array where id = 1 is exposed and id = 0 is neighbor (see sketch below)
211 | pt_clouds = []
212 | state = []
213 | iden = []
214 | for i_cont, cont in enumerate(container):
215 | if type(cont) == set and is_cont_of_str == False:
216 | cont = list(cont)
217 | for i_cell, cell_center in enumerate(cont):
218 | if type(cell_center) == str:
219 | cell_center = eval(cell_center)
220 | pt_clouds.append([cell_center[0], cell_center[1], cell_center[2]])
221 | try:
222 | state.append(cont[cell_center]['state'])
223 | iden.append(1)
224 | except:
225 | state.append(0)
226 | iden.append(0)
227 | out_cloud = np.array(pt_clouds)
228 | out_state = np.array(state)
229 | out_iden = np.array(iden)
230 | return(out_cloud,out_state,out_iden)
231 |
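# Hedged usage sketch of the container conventions make_cloud assumes: exposed
# cells live in a dict keyed by (x, y, z) tuples with a 'state' entry, neighbor
# cells in a plain set of tuples; all numbers below are invented.
# exposed = {(0., 0., 500.): {'state': 0.5}, (5., 0., 500.): {'state': 1.0}}
# neighbors = {(0., 5., 500.)}
# pts, states, iden = make_cloud([exposed, neighbors])
# pts.shape, states.tolist(), iden.tolist()
# # ((3, 3), [0.5, 1.0, 0.0], [1, 1, 0])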
232 |
233 |
234 | def make_grid(containers,cell_size):
235 |     # helper function to make an unstructured grid from cell center points
236 |     # passed as dictionary keys or a set of tuples (see sketch after function)
237 | grids = []
238 | for container in containers:
239 | cells = []
240 | offset = np.arange(0,9*len(container),9)
241 | cell_type = np.array([pv.vtk.VTK_HEXAHEDRON]*len(container))
242 | point_count = 0
243 | pts = np.array([])
244 | if type(container) == set: container = list(container)
245 | n_pts = len(container)
246 | for i_cell, cell_center in enumerate(container):
247 | if i_cell%1000 == 0: print('building cell %i of %i' %(i_cell,n_pts))
248 | cells.append(8)
249 | for p in range(8):
250 | cells.append(point_count)
251 | point_count += 1
252 | x = cell_center[0]
253 | y = cell_center[1]
254 | z = cell_center[2]
255 | d = cell_size/2
256 |             # VTK_HEXAHEDRON point order: bottom face counter-clockwise, then top face
257 | cell_pts = np.array([[x-d,y-d,z-d],[x+d,y-d,z-d],
258 | [x+d,y+d,z-d],[x-d,y+d,z-d],
259 | [x-d,y-d,z+d],[x+d,y-d,z+d],
260 | [x+d,y+d,z+d],[x-d,y+d,z+d]])
261 | cell_pts = np.around(cell_pts,3)
262 | if pts.size == 0:
263 | pts = cell_pts
264 | else:
265 | pts = np.vstack((pts,cell_pts))
266 | # make unstructured grid
267 | grid = pv.UnstructuredGrid(offset,
268 | np.array(cells),
269 | cell_type,
270 | pts)
271 | # make cell_arrays for state and neighbor label
272 | grid.cell_arrays['state'] = np.zeros(len(container))
273 | grid.cell_arrays['neighbor'] = [False] * len(container)
274 | # assuming exposed cells with a state are in a dict
275 | if type(container) == dict:
276 | for i_cell,cell_center in enumerate(container):
277 | grid.cell_arrays['state'][i_cell] = \
278 | container[cell_center]['state']
279 | # and type set (now list) is the neighbor cells
280 | elif type(container) == list:
281 | for i_cell,cell_center in enumerate(container):
282 | grid.cell_arrays['neighbor'][i_cell] = True
283 | # add to grids list
284 | grids.append(grid)
285 | # now merge grids together
286 | if len(grids) == 2:
287 | out_grid = grids[0].merge(grids[1],merge_points=False,
288 | main_has_priority=True)
289 | elif len(grids) == 1:
290 | out_grid = grids[0]
291 | else:
292 | out_grid = grids[0]
293 | for grid in grids[1:]:
294 | out_grid = out_grid.merge(grid,merge_points=False,
295 | main_has_priority=True)
296 |
297 | return out_grid
298 |
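# Hedged usage sketch, assuming the same legacy pyvista UnstructuredGrid and
# merge signatures used above: one exposed cell (dict with a 'state') plus one
# neighbor cell (set) become a two-cell hexahedral grid. Numbers are invented.
# exposed = {(0., 0., 500.): {'state': 0.5}}
# neighbors = {(5., 0., 500.)}
# grid = make_grid([exposed, neighbors], cell_size=5.0)
# grid.n_cells   # -> 2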
299 |
300 |
301 | def plot_subset_index_cells(grid,i_cell,i_neighbors=np.array([])):
302 | plotter = pv.BackgroundPlotter()
303 | plotter.add_mesh(grid.extract_all_edges(), color='k', label='whole mesh')
304 | if i_neighbors.size != 0:
305 | plotter.add_mesh(grid.extract_cells(i_neighbors), color=True,
306 | opacity=0.5, label='neighbors')
307 | plotter.add_mesh(grid.extract_cells(i_cell), color='pink',
308 | opacity=0.75, label='the cell')
309 | plotter.add_legend()
310 | plotter.show()
311 | return
312 |
313 | def plot_subset_cells(grid,subset=None,scalar='z',invert=False):
314 | if invert == True:
315 | flag_bool = False
316 | flag_int = 0
317 | else:
318 | flag_bool = True
319 | flag_int = 1
320 | if subset != None:
321 | extract_idx = []
322 | for i_cell in range(grid.n_cells):
323 | if grid.cell_arrays[subset][i_cell] == flag_bool \
324 | or grid.cell_arrays[subset][i_cell] == flag_int:
325 | extract_idx.append(i_cell)
326 | plot_grid = grid.extract_cells(extract_idx)
327 | else:
328 | plot_grid = grid
329 | plotter = pv.BackgroundPlotter()
330 | if scalar == 'z':
331 | s = plot_grid.points[:,2]
332 | else:
333 | s = plot_grid.cell_arrays[scalar]
334 |
335 | plotter.add_mesh(plot_grid, show_edges=True, scalars=s)
336 | plotter.add_scalar_bar()
337 | return
338 |
339 |
340 | def is_in_mask(x,y,mask_paths,radius=0.0,alt_label=False):
341 |     # True for an x, y point coordinate inside the mask contour, False for one
342 |     # outside it; a truthy alt_label (e.g. 'SurfBound' for a point on the wafer
343 |     # surface at an etch boundary) is returned as-is (see sketch after function)
344 |
345 | try:
346 | # one point, multiple mask paths
347 | for path in mask_paths:
348 | if alt_label != False:
349 | inside = alt_label
350 | break
351 | elif alt_label == False:
352 | inside = mask_paths[path].contains_point((x,y),radius=radius)
353 | if inside == True: break
354 | except:
355 | # multiple points, one mask path
356 | if type(x) is np.ndarray:
357 | if alt_label == False:
358 | inside = [mask_paths.contains_point((i,j),radius=radius) \
359 | for i,j in zip(x,y)]
360 | elif alt_label != True:
361 | inside = [alt_label for i,j in zip(x,y)]
362 | else:
363 | if alt_label == False:
364 | inside = mask_paths.contains_point((x,y),radius=radius)
365 | elif alt_label != False:
366 | inside = alt_label
367 | return inside
368 |
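# Hedged usage sketch: mask_paths is assumed (as elsewhere in the model) to be
# a matplotlib Path, or a dict of them, traced from the mask image; the square
# below and its size are invented.
# from matplotlib.path import Path
# square = Path([(-50,-50), (50,-50), (50,50), (-50,50), (-50,-50)])
# is_in_mask(0.0, 0.0, square)                                   # -> True
# is_in_mask(np.array([0., 80.]), np.array([0., 80.]), square)   # -> [True, False]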
369 |
370 | def fix_norm(pt, norm, ref_pt=np.array([0,0,0])):
371 | # determine angle between unit vector and vector between point and
372 | # reference point
373 | ref_vec = np.array([pt[0] - ref_pt[0],
374 | pt[1] - ref_pt[1],
375 | pt[2] - ref_pt[2]])
376 | ref_vec = ref_vec / np.linalg.norm(ref_vec)
377 | angle1 = np.arccos(np.dot(ref_vec,norm))
378 | angle2 = np.arccos(np.dot(ref_vec,-1*norm))
379 |
380 | if angle2 > angle1:
381 | pass
382 | else:
383 | norm = -1*norm
384 | return norm
385 |
386 | def compute_normals(points, use_nn=False, flat=False,
387 | ref_pt=np.array([0,0,0])):
388 |
389 | if flat == True: # add dummy layer to make 3D
390 | knns = NearestNeighbors(n_neighbors=use_nn).fit(points)
391 | dists, indices = knns.kneighbors(points, return_distance=True)
392 | dummy_dist = np.mean(dists)
393 | dummy_upper_layer = np.copy(points)
394 | dummy_lower_layer = np.copy(points)
395 | dummy_lower_layer[:,-1] -= dummy_dist
396 | dummy_upper_layer[:,-1] += dummy_dist
397 | dummy_layers = np.vstack((dummy_upper_layer, dummy_lower_layer))
398 | points = np.vstack((points,dummy_layers))
399 |
400 |
401 | # fig = mpl.pyplot.figure()
402 | # ax = fig.add_subplot(111, projection='3d')
403 | if type(points) == dict:
404 | pts = [list(cell) for cell in points]
405 | cloud = make_cloud([points])[0]
406 | cloud = np.vstack((cloud,pts))
407 | knns = NearestNeighbors(n_neighbors=use_nn).fit(cloud)
408 | dists, indices = knns.kneighbors(cloud, return_distance=True)
409 | unit_norms = {}
410 | for pt in list(points.keys()):
411 | x, y, z = pt[0], pt[1], pt[2]
412 | nns = []#[[x, y, z]]
413 | # collect coordinates of nns
414 | pt_i = np.where((cloud[:,0] == x) & (cloud[:,1] == y)
415 | & (cloud[:,2] == z))[0]
416 | pt_i = pt_i[0]
417 | for nn in range(use_nn):
418 | nns.append(cloud[indices[pt_i][nn],:])
419 | nns = np.array(nns)
420 |
421 | # compute centroid and shift points relative to it
422 | cent = np.mean(nns, axis=0)
423 | xyzR = nns - cent
424 | u, sigma, v = np.linalg.svd(xyzR)
425 | unit_norm = v[2] / np.linalg.norm(v[2])
426 | unit_norms[pt] = unit_norm
427 |
428 | elif points.shape[0] == 3:
429 | cloud = make_cloud([points])[0]
430 | knns = NearestNeighbors(n_neighbors=use_nn).fit(cloud)
431 | dists, indices = knns.kneighbors(cloud, return_distance=True)
432 | pt_i = np.where((cloud[:,0] == points[0]) & (cloud[:,1] == points[1])
433 | & (cloud[:,2] == points[2]))[0]
434 | if pt_i.shape[0] == 0:
435 | cloud = np.vstack((cloud,points))
436 | knns = NearestNeighbors(n_neighbors=use_nn).fit(cloud)
437 | dists, indices = knns.kneighbors(cloud, return_distance=True)
438 | pt_i = np.where((cloud[:,0] == points[0]) & (cloud[:,1] == points[1])
439 | & (cloud[:,2] == points[2]))[0]
440 | nns = []
441 | for nn in range(use_nn):
442 | nns.append(cloud[indices[pt_i][0][nn],:])
443 | nns = np.array(nns)
444 | cent = np.mean(nns, axis=0)
445 | xyzR = nns - cent
446 | # xyzRT = np.transpose(xyzR)
447 | # compute singular value decomposition
448 | u, sigma, v = np.linalg.svd(xyzR)
449 | unit_norms = v[2] / np.linalg.norm(v[2])
450 |
451 | else:
452 | knns = NearestNeighbors(n_neighbors=use_nn).fit(points)
453 | dists, indices = knns.kneighbors(points, return_distance=True)
454 | unit_norms = []
455 | for pt_i, pt in enumerate(points):
456 | x, y, z = pt[0], pt[1], pt[2]
457 | nns = []#[[x, y, z]]
458 | # collect coordinates of nns
459 | for nn in range(use_nn):
460 | nns.append(points[indices[pt_i][nn],:])
461 | nns = np.array(nns)
462 |
463 | # compute centroid and shift points relative to it
464 | cent = np.mean(nns, axis=0)
465 | xyzR = nns - cent
466 | # xyzRT = np.transpose(xyzR)
467 | # compute singular value decomposition
468 | u, sigma, v = np.linalg.svd(xyzR)
469 | unit_norm = v[2] / np.linalg.norm(v[2])
470 | unit_norms.append(unit_norm)
471 | # determine angle between unit vector and vector between point and
472 | # reference point
473 | # ref_vec = np.array([x - ref_pt[0], y - ref_pt[1], z - ref_pt[2]])
474 | # ref_vec = ref_vec / np.linalg.norm(ref_vec)
475 |
476 | # angle1 = np.arccos(np.dot(ref_vec,unit_norm))
477 | # angle2 = np.arccos(np.dot(ref_vec,-1*unit_norm))
478 |
479 | # if angle2 > angle1:
480 | # unit_norms.append(unit_norm)
481 | # else:
482 | # unit_norms.append(-1*unit_norm)
483 | unit_norms = np.array(unit_norms)
484 | return unit_norms
485 |
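# Hedged sanity-check sketch: for points sampled from a flat z = 0 patch the SVD
# plane fit should return normals of +/- [0, 0, 1] (the sign is not fixed here;
# see fix_norm / compute_angle). The grid of points below is invented.
# xx, yy = np.meshgrid(np.arange(5.), np.arange(5.))
# plane_pts = np.column_stack((xx.ravel(), yy.ravel(), np.zeros(xx.size)))
# norms = compute_normals(plane_pts, use_nn=6)
# np.allclose(np.abs(norms[:, 2]), 1.0)   # -> True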
486 | def compute_angle(norm,ref_pt=[0,0,0],wafer_thickness=500):
487 |     # reference direction: straight up (+z) toward the ion source above the wafer
488 |     ref_vec = np.array([0, 0, wafer_thickness + 10 - ref_pt[2]])
490 | ref_vec = ref_vec / np.linalg.norm(ref_vec)
491 | angle1 = np.arccos(np.dot(ref_vec,norm))
492 | angle2 = np.arccos(np.dot(ref_vec,-1*norm))
493 | if angle2 > angle1:
494 | angle = angle1
495 | else:
496 | angle = angle2
497 | return angle
498 |
499 |
500 | def normal_from_neighbors(cell_tuple,removed_cells,cell_size,n_cells_span=2,
501 | wafer_thickness=500):
502 | x, y, z = cell_tuple[0], cell_tuple[1], cell_tuple[2]
503 | x_c = np.linspace(x-n_cells_span*cell_size,
504 | x+n_cells_span*cell_size,2*n_cells_span+1)
505 | y_c = np.linspace(y-n_cells_span*cell_size,
506 | y+n_cells_span*cell_size,2*n_cells_span+1)
507 | z_c = np.linspace(z-n_cells_span*cell_size,
508 | z+n_cells_span*cell_size,2*n_cells_span+1)
509 | x_mesh, y_mesh, z_mesh = np.meshgrid(x_c, y_c, z_c)
510 |
511 | radius = (cell_size/2) * np.sqrt(24) + (cell_size/2)*0.1
512 | # radius = ((cell_size) * np.sqrt(5)) + (cell_size/2)*0.2
513 |
514 | normal_vect = [0,0,0]
515 |
516 | for i in range(len(x_c)):
517 | for j in range(len(y_c)):
518 | for k in range(len(z_c)):
519 | x_v = round(x_mesh[i,j,k],3)
520 | y_v = round(y_mesh[i,j,k],3)
521 | z_v = round(z_mesh[i,j,k],3)
522 | if (str((x_v,y_v,z_v)) in removed_cells and \
523 | z_v <= wafer_thickness):
524 | d = np.sqrt((x-x_v)**2 + (y-y_v)**2 + (z-z_v)**2)
525 | if d < radius:
526 | normal_vect[0] += round(x_v - x,3)
527 | normal_vect[1] += round(y_v - y,3)
528 | normal_vect[2] += round(z_v - z,3)
529 | if np.linalg.norm(normal_vect) == 0:
530 | normal_vect = compute_normals(np.array(list(cell_tuple)), use_nn=8,
531 | ref_pt=np.array([0,0,0]))
532 | unit_norm = normal_vect / np.linalg.norm(normal_vect)
533 | return(unit_norm)
534 |
535 | def compute_normals_for_cells(exposed_cells,removed_cells,
536 | cell_size,n_cells_span=2,
537 | wafer_thickness=500):
538 |
539 | radius = (cell_size/2) * np.sqrt(24) + (cell_size/2)*0.102
540 | normal_vects = {}
541 | alt_method_points = {}
542 | alt_method_flag = False
543 | n_cells = len(exposed_cells)
544 | n_normals = 0
545 | try:
546 | keys = list(exposed_cells.keys())
547 | except:
548 | keys = [exposed_cells]
549 |
550 | for i_cell, cell in enumerate(keys):
551 | if i_cell%(int(n_cells/5)) == 0:
552 | print('\t\tcomputing normal for pt %i of %i' %(i_cell,n_cells))
553 | try:
554 | need_norm = exposed_cells[cell]['need_norm']
555 | except:
556 | need_norm = True
557 | if need_norm == True:
558 | n_normals += 1
559 | x, y, z = cell[0], cell[1], cell[2]
560 | x_c = np.linspace(x-n_cells_span*cell_size,
561 | x+n_cells_span*cell_size,2*n_cells_span+1)
562 | y_c = np.linspace(y-n_cells_span*cell_size,
563 | y+n_cells_span*cell_size,2*n_cells_span+1)
564 | z_c = np.linspace(z-n_cells_span*cell_size,
565 | z+n_cells_span*cell_size,2*n_cells_span+1)
566 | x_mesh, y_mesh, z_mesh = np.meshgrid(x_c, y_c, z_c)
567 | normal_vect = [0,0,0]
568 | for i in range(len(x_c)):
569 | for j in range(len(y_c)):
570 | for k in range(len(z_c)):
571 | x_v = round(x_mesh[i,j,k],3)
572 | y_v = round(y_mesh[i,j,k],3)
573 | z_v = round(z_mesh[i,j,k],3)
574 | if (str((x_v,y_v,z_v)) in removed_cells and \
575 | z_v <= wafer_thickness):
576 | d = np.sqrt((x-x_v)**2 + (y-y_v)**2 + (z-z_v)**2)
577 | if d < radius:
578 | normal_vect[0] += round(x_v - x,3)
579 | normal_vect[1] += round(y_v - y,3)
580 | normal_vect[2] += round(z_v - z,3)
581 | if np.linalg.norm(normal_vect) == 0:
582 |                 # if this happens the point is surrounded by a full sphere of
583 |                 # removed neighbors, making the normal_vect magnitude 0, so just
584 |                 # assign a dummy normal
585 | normal_vects[cell] = np.array([1,0,0])
586 | alt_method_flag = True
587 | else:
588 | normal_vects[cell] = (normal_vect / np.linalg.norm(normal_vect))
589 | if alt_method_flag == True:
590 | print('COMPUTING NORMAL W/ PT CLOUD')
591 | try:
592 | alt_method_points = compute_normals(alt_method_points, use_nn=5)
593 | except:
594 | print('\tEXCEPTION')
595 |
596 | print('\t\t\tassigning normals to %i cells' % n_normals)
597 | for cell in list(exposed_cells.keys()):
598 | if exposed_cells[cell]['need_norm'] == True:
599 | try:
600 | exposed_cells[cell]['normal'] = normal_vects[cell]
601 | except:
602 | exposed_cells[cell]['normal'] = alt_method_points[cell]
603 | exposed_cells[cell]['need_norm'] = False
604 | return exposed_cells
--------------------------------------------------------------------------------