Interactive programming paradigm for real-time experimentation with remote living matter
Biology cloud laboratories are an emerging approach to lowering access barriers for life-science experimentation. However, suitable programming approaches and interfaces are lacking for both domain experts and lay users, especially ones that enable interaction with the living matter itself and not just the control of equipment. Here we present a programming paradigm for real-time interactive applications with remotely housed biological systems which is accessible and useful for scientists, programmers, and lay people. Our user studies show that scientists and nonscientists are able to rapidly develop a variety of applications, such as interactive biophysics experiments and games. This paradigm has the potential to make first-hand experiences with biology accessible to all of society and to accelerate the rate of scientific discovery.
- human–computer interaction
- cloud laboratory
- augmented reality
- swarm programming
- interactive biotechnology
Life-science research is increasingly accelerated through the advancement of automated, programmable instruments (1). Nevertheless, many usage barriers to such instruments exist, primarily due to physical access restrictions, advanced training needs, and limitations in programmability. Equivalent barriers for computing (2–4) have been solved through application programming interfaces (APIs) (5), domain-specific applications, and cloud computing (6–9). Consequently, cloud laboratories to remotely experiment with biological specimens have been developed and deployed for academia and industry (10), with applications including citizen science games (11, 12) and online education (13–16). Different approaches have been taken to make automated wet laboratory instruments programmable: Roboliq (17) uses artificial intelligence (AI) to ease the development of complex protocols to instruct liquid-handling robots; BioBlocks (18) and Wet Lab Accelerator (10) are web-based visual programming environments for specifying instrument protocols on cloud laboratories like Transcriptic (10).
Beyond programming automated laboratory experiments, we propose that there is an emerging need for a more general programming paradigm that allows users to develop applications that enable real-time interaction with the living matter itself. In analogy to conventional computers, this can be seen as the difference between numerical calculations by mathematicians vs. truly interactive applications like word processing (19), interactive graphical programs (20), and computer games (21) used by all strata of society. In other words, first-hand interactive experience with microbiology should become accessible for everyone. Such concepts of “human–biology interaction” (HBI) have been explored previously through interactive museum installations (22) and educational games (23), but both the software and the hardware always had to be developed from the ground up. Swarm programming abstractions for easier development of interactive applications have also been proposed (24). Other potential future applications include living material microfabrication through light stimulation, self-assembly, and swarm robotics, e.g., with engineered bacteria or molecular motors (25–31).
We determined through iterative design, development, and user testing that a system for remotely programming living matter should ideally have the following minimal set of components (Fig. 2): (i) end user and programming environments supporting online and event-driven application conventions (Fig. 2 A, B, and H); (ii) a set of programming functions for manipulating and sensing biological matter and integrating with standard programming logic (Fig. 2C); (iii) biotic processing units (BPUs) to digitally interface with the biological specimen and where a cluster of such BPUs is hosted on the cloud (Fig. 2E); (iv) virtual BPUs that simulate all real BPU functionalities, allowing the users to switch between virtual and real BPUs (Fig. 2 D and F); and (v) real-time conversion of BPU raw output into high-level accessible data structures (Fig. 2G). We developed Bioty as a specific implementation that integrates all of these components. Humans working with this system fall into two categories, end users and programmers, with the latter developing applications in Bioty that the former then use.
In analogy to electronic microprocessors like GPUs (33), BPUs (14, 15, 34) are devices that house, actuate, and measure microbiological systems. Computing is defined by the Association for Computing Machinery (ACM) as a series of “processes that describe and transform information” (35). A BPU performs biological computation by transforming a digital input into a physical stimulus affecting analog biological behavior, which is then converted back into digital output. A BPU can be programmed like a conventional microprocessor, with a domain-specific instruction set where the “computational algorithms” are realized through the nondeterministic biological behavior and responses of living matter (24).
Here, we use a previously described BPU architecture (14) (Fig. 2E): Photophobic E. gracilis cells (36, 37) are housed in a quasi-2D microfluidic chip, and the modifiable light intensity of four LEDs placed in each cardinal direction can stimulate cells to swim away from light, which is recorded by a microscope camera (Fig. 2E). More complex responses are also possible (24).
Biology Cloud Laboratory.
A cloud laboratory has the advantage of making biology experiments accessible from anywhere. However, the presented programming paradigm is equally suitable for a local implementation.
We developed Bioty over the existing cloud laboratory architecture, described previously in ref. 14, which provides real-time interactive access to a cluster of BPUs (Fig. 2E), e.g., through a virtual joystick.
Biological Data Structures.
The raw data stream from a BPU should be preprocessed in real time into higher-level data types that enable direct access to state variables about the biological material. Such abstractions allow programmers to treat biological objects (e.g., cells) like sprites (38) or objects in a database, whose state (e.g., position or gene expression level) can be queried and manipulated in real time (24).
In Bioty, we implemented a continuous image-processing layer where cells are continuously tracked (Fig. 2G). Information about individual cells, such as position and orientation, is extracted and associated with a cell index. The resulting data structures can be queried directly.
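Queries against such cell data structures can be sketched as follows. This is a minimal illustration, assuming per-cell records with `id`, `x`, `y`, and orientation `theta` fields as described above; the function names are illustrative, not the actual Bioty API.

```javascript
// Hypothetical queries over the high-level cell records produced by the
// real-time tracking layer. Each record: { id, x, y, theta }.
function cellsInRegion(cells, x0, y0, x1, y1) {
  // Return the tracked cells whose positions fall inside the rectangle.
  return cells.filter(c => c.x >= x0 && c.x < x1 && c.y >= y0 && c.y < y1);
}

function meanOrientation(cells) {
  // Average orientation of a set of cells via the mean resultant vector,
  // which handles angle wrap-around correctly.
  const c = cells.reduce((sum, cell) => sum + Math.cos(cell.theta), 0);
  const s = cells.reduce((sum, cell) => sum + Math.sin(cell.theta), 0);
  return Math.atan2(s, c);
}
```

Treating cells as queryable records in this way lets a program react to the swarm state on every frame without touching raw pixels.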
There are three fundamental categories of functions for programming living matter: stimulus control (actuation), organism sensing, and application creation (Fig. 2C). The actuating (“writing”) functions affect the state of biological matter via a physical stimulus inside the BPU. The sensor functions “read” the state of the biological matter. The application creation functions consist of all other standard programming functionalities.
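The three categories can be sketched as a grouped interface. `setLED` is named in the text; its signature here and the other member names are assumptions for illustration only.

```javascript
// Hypothetical grouping of the three function categories for programming
// living matter. All signatures are illustrative stand-ins.
const bioticAPI = {
  // Stimulus control ("write"): actuate a physical stimulus inside the BPU.
  setLED: (direction, intensity) => ({ kind: 'actuate', direction, intensity }),
  // Organism sensing ("read"): query the state of the biological matter.
  getCellCount: (cells) => cells.length,
  // Application creation: standard programming utilities (drawing, logic).
  drawBox: (x, y, w, h) => ({ kind: 'draw', x, y, w, h }),
};
```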
An accessible user interface and programming environment are required, reflecting the standards of online (39, 40) and event-driven (41, 42) development environments. Interactive applications can then be developed and executed by any programmer and end user.
The client side of Bioty (Figs. 2 B and H and 3 and Movie S1) has a programming interface (Fig. 3B) and program output that includes the live BPU camera feed with virtual overlays generated by the program (Fig. 3C). The programming interface has distinct areas (Fig. 3B) for five separate event-driven functions: (i) “startProgram” runs at the beginning of program execution, (ii) “endProgram” runs after program termination, (iii) “run” continuously operates during program execution at a rate of 1 kHz, (iv) “onKeypress” runs when the user presses a key, and (v) the “onJoystickChange” function runs when the user operates the virtual joystick (Fig. 3D). The joystick angle and magnitude map to LED intensity inside the BPU. Standard features supporting programmers and end users are in place (Fig. 3A): “Run” and “Stop” buttons trigger the “startProgram” and “endProgram” events, and “Save Code” and “Load Code” buttons enable file handling. A user can also run the same program on different BPUs.
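One plausible form of the joystick-to-LED mapping is to light the cardinal LED nearest to the joystick angle with an intensity proportional to the joystick magnitude. This sketch is an assumption about the mapping, not the documented Bioty behavior; the function and LED names are illustrative.

```javascript
// Hypothetical onJoystickChange mapping: the joystick angle selects the
// closest of the four cardinal LEDs, and the magnitude (0..1) sets its
// intensity (0..100). The actual Bioty mapping may differ.
const LEDS = ['right', 'up', 'left', 'down']; // at 0, 90, 180, 270 degrees

function joystickToLED(angleDeg, magnitude) {
  // Normalize the angle into [0, 360) and pick the nearest cardinal LED.
  const a = ((angleDeg % 360) + 360) % 360;
  const idx = Math.round(a / 90) % 4;
  const intensities = { right: 0, up: 0, left: 0, down: 0 };
  intensities[LEDS[idx]] = Math.round(100 * magnitude);
  return intensities;
}
```

An `onJoystickChange` handler would then forward the resulting intensities to the BPU on every joystick event.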
Simulation Mode—Virtual BPU.
A virtual BPU should be integrated that can be programmatically accessed equivalently to a real BPU, using the same programming commands. It should be simple for a user to switch between the real and simulated BPUs. Here, the actual “biological computation” will likely not be the same for the real and virtual BPU, as the fidelity of the underlying model is typically limited by incomplete knowledge of the biological system and by computational power. The virtual BPU is useful as it (i) allows fast and cheap testing and debugging of programs, (ii) enables application development even if there are more developers than available real BPUs, and (iii) enables life-science research in which models of the biological system are developed and tested side by side against the live experiment.
We implemented the virtual BPU (Fig. 2F) with a simple model where animated Euglena respond to simulated “setLED” stimuli with negative phototaxis (Materials and Methods). We integrated the virtual BPU on the client side, but it could also be implemented on the server side, depending on computational resource limitations on either end. The virtual BPU feeds simulated images into the real-time image processing module.
Results: Use Cases and User Studies
S1a: Example Applications.
We first illustrate the programming potential and versatility of Bioty through three applications created by the research team. These applications also provide use cases of the stimulus control, biology sensor, and application creation functions while requiring comparatively few lines of code (20, 71, and 30 lines, respectively).
This program tracks all live cells on screen and draws a box around each, with the tracking ID displayed next to the box (Fig. 4A and Movie S2), providing a direct visualization of the underlying tracking algorithm.
This program visualizes the instantaneous average Euglena velocity (Fig. 4B), which might be useful for real-time data visualization during biophysics experimentation.
Guess the LED game.
This program is a “biotic game” (43) where the player has to guess which of the four LEDs is switched on based on the observed direction of Euglena swarm movement (Fig. 4C and Movie S3), scoring one point for every correct guess. This game might be used to teach about phototactic behavior, also highlighting the few seconds of delay between light stimulus and cell reorientation.
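The game's core check can be sketched from the behavior described above: since Euglena swim away from the lit LED, the swarm's mean displacement points opposite to the active LED. The function names are illustrative, not the actual game code.

```javascript
// Sketch of the "Guess the LED" core logic, assuming negative phototaxis:
// the mean swarm displacement (dx, dy) points away from the lit LED.
const DIRS = ['right', 'up', 'left', 'down']; // at 0, 90, 180, 270 degrees

function inferLitLED(dx, dy) {
  const moveAngle = Math.atan2(dy, dx);   // direction the swarm moves in
  const ledAngle = moveAngle + Math.PI;   // the light sits on the opposite side
  const idx = ((Math.round(ledAngle / (Math.PI / 2)) % 4) + 4) % 4;
  return DIRS[idx];
}

function scoreGuess(guess, litLED) {
  // One point for a correct guess, as in the game description.
  return guess === litLED ? 1 : 0;
}
```

In the actual game the player, not the program, infers the LED; a function like `inferLitLED` would serve to validate the answer against the observed swarm motion.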
S1b: HBI Novices as Programmers.
All participants worked on site in our laboratory. To first familiarize the participants with this programming paradigm, they completed two structured tasks where they modified existing programs (see Table 1 for completion times as well as SI Appendix, section S3: Further description of study S1b). They then performed two free-form programming tasks to determine the types of applications that novices may create (Table 1). All seven participants completed all required structured programming tasks (detailed in SI Appendix, section S3) and developed at least one free-form application, although some programs had some bugs due to the 2-h time constraint. Task completion times appear in Table 1; no single programming task took more than 1.5 h to complete.
To simulate BPU timing constraints with an increased number of users, we implemented a fixed session time, after which the participants were locked out of the BPU and had to log back in, either to the same BPU or to a new one. The first two participants commented on the inconvenience of this time pressure (example quote: “I have to admit, having the timer count down while I’m writing my code is pretty stressful”). This suggested to us that it would be beneficial to address these physical resource limitations with a virtual development and execution mode. We then implemented a virtual BPU that simulates the main aspects of the real BPU; from both an end-user and a programmer perspective, the virtual and real BPUs are handled equivalently (Simulation Mode—Virtual BPU and Fig. 2F). All of the following study participants had access to this virtual BPU, including in subsequent studies.
Two notable programs developed by the participants demonstrate biotic games (23) and applications for data collection.
Moving box game.
Swarm movement statistics.
Overall we found that these HBI novices were able to successfully develop versatile applications. In the poststudy questionnaire, seven of seven participants mentioned the ease of use (example quote: “The API was very straightforward and simple to use. It does not take much time to ramp up on the API, which made it fun and way faster to move toward actually using the program”). Optional feedback also indicated that programmers learned new biology during the development process (example quote: “I realized their individual behaviors are really variable; some of them barely respond to light, and some of them respond really quickly”). This suggests that programming with living matter can facilitate experimentation and education online.
S1c: HBI Experts as Programmers.
All participants worked remotely and successfully developed applications of their own choosing, spending between 77 min and 351 min (mean = 190.7 min; median = 137 min), using between 42 and 123 lines of code, and using all types (organism, sensor, and application) of available API calls (Fig. 2C) with different frequencies (Table 2 and SI Appendix, section S9). Participants used both the live BPU and the simulation mode, spending the majority of time (77.6%, SD = 11.4%) in the former. Two notable programs developed by the HBI experts demonstrate how they applied the paradigm to real-time data visualization and game design (23).
When asked whether developing their application with Bioty was easier or harder than it would have been using previously existing HBI approaches and tools, all five participants stated that Bioty was much easier. This is also indicated by the relatively low development time and few lines of code (Table 2 and SI Appendix, section S12).
S2: HBI Novices as End Users.
To test whether programs written in Bioty are ultimately usable for end users, we developed a set of educational applications centered around self-guided science experiments (Fig. 6). These applications were developed by the research team as an iteration of a program created by a semiexpert user (Fig. 5D), who was not a member of any study but used Bioty in a hackathon at Stanford University. These applications present moving sliders to control the LED stimulus and real-time visualizations of the average orientation of all organisms depicted through the circular variance (Fig. 3). The main learning goals were as follows: Euglena respond to light, Euglena orient with the direction of the light, Euglena response depends on light intensity, and it takes a few seconds until the Euglena have fully responded and reoriented with the light. These applications build up in complexity over five phases (SI Appendix, Fig. S1). Fig. 6 and Movies S6 and S7 show the final application in the sequence.
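The circular-variance readout used in these visualizations can be computed from the tracked cell orientations. This is a minimal sketch of the standard statistic; the function name is illustrative.

```javascript
// Circular variance of a set of orientation angles (radians): 1 - R,
// where R is the length of the mean resultant vector.
// Result is 0 when all cells are aligned and approaches 1 when
// orientations are fully dispersed.
function circularVariance(angles) {
  const n = angles.length;
  const c = angles.reduce((sum, a) => sum + Math.cos(a), 0) / n;
  const s = angles.reduce((sum, a) => sum + Math.sin(a), 0) / n;
  return 1 - Math.sqrt(c * c + s * s);
}
```

Unlike a naive average of angles, this statistic is insensitive to the 0/2π wrap-around, which matters when the swarm straddles that boundary.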
Seventeen remote HBI novices aged 22–55 y (mean = 26.0 y, SD = 8.2 y) were recruited to participate in the guided experimentation platform. Eleven of the participants identified as female and six as male. The participants were asked for their prior familiarity with Euglena on a scale from 1 to 5, where 1 corresponds to having never heard of Euglena, 3 corresponds to having read about Euglena before, and 5 corresponds to working with Euglena regularly. The participants’ responses ranged from 1 to 4 (mean = 1.8, SD = 0.4). Among these participants were also three regular Eterna (11) players (>30 h of play per month over the past 1.5 y), who were recruited to gather preliminary insight about Bioty’s potential as a citizen science platform.
After the first activity, 9/17 participants reported movement toward or away from light. The others reported that the cells spun in place around their axes. After the last activity, 14/17 participants reported that Euglena move away from light. Eleven of 17 participants reported that the intensity of light affects the speed at which the Euglena move away from light. Fourteen of 17 participants reported a response time, ranging from 2 s to 1 min (mean = 31.8 s, SD = 8.7 s). The concept of circular variance was harder to grasp for the participants: Only 9/17 correctly stated that “alpha” meant the average orientation of the cells. Overall, participants stated that they learned certain concepts from the applications, such as the fact that Euglena have a negative response to light. This demonstrates that applications written in Bioty could support science education.
In the poststudy questionnaire, participants were asked free-form questions about their experience. Regarding the difference between using live and simulated experiments, 11/17 found the simulation to be more predictable and reliable. Nevertheless, 13/17 preferred the live mode over simulation, 1/17 preferred the simulations, and the others did not have a preference. Participants also pointed out that while the live mode interacts with real organisms, the simulation is easier to use. When asked about the advantages of a live experimentation platform, two of the Eterna players responded that it is “more factual” and “What you see is what you get.” Hence the feedback combined from all participants is consistent with previous findings [e.g., from the educational literature (45–47)] that both experiments and simulations synergistically motivate scientific inquiry, and the presented IDE supports both.
We demonstrated a programming API and IDE to perform real-time interactive experiments with living matter, both locally and on a cloud laboratory and for experts and novices alike. Bioty allows for the specification of interactive experimentation, the program execution to adjust to the biological response via real-time feedback, the integration of a simulation mode, and the creation of interactive programs for remote end users. The API allows for organism sensing via real-time object tracking, organism control through user-controlled highly precise light stimuli enabled through communication with a remote web server hosting BPUs, and application development through a user-friendly drawing and program control library.
The usability and ease of application development using this paradigm were successfully evaluated through multiple user studies. Participants from a variety of backgrounds, both HBI novices and experts, mastered the familiarization tasks and developed applications of their choosing. The applications developed in study S1a demonstrated the feasibility of versatile use cases, e.g., data visualization, automated scientific experiments, interactive scientific experiments, art, and games (Fig. 5). Furthermore, the applications were created rapidly (less than 6 h, with less than 150 lines of code; Table 2). The novice HBI users in study S1b indicated that programming had a low barrier to entry, and the HBI experts in study S1c confirmed easier and faster development than previous approaches (14, 15, 22, 24, 48) (SI Appendix, section S5). Study S2 demonstrated that end-user applications (e.g., for science education) can be implemented that leverage the real-time interactivity with the biological substrate. Hence, the Bioty paradigm follows Seymour Papert’s vision of interfaces with “low-floor/high-ceiling/wide-walls” (49, 50) and constitutes a significant step toward making experimentation, engineering, and interaction with living matter more accessible to a broader community.
This programming paradigm for living matter and its implemented architecture (Fig. 2) enables versatile future applications in research, fabrication, and education. It generalizes beyond the domain-specific Euglena biocomputing used here to other biological, chemical, and physical systems with different control capabilities, e.g., chemically responsive bacteria or molecular motors for swarm robotics and living material fabrication (24, 25, 29–31, 35, 51). Higher spatiotemporal manipulation through more complex light fields and programming abstractions is possible (e.g., “move cell i right by 5”) (24). Citizen science projects like Eterna (11) and Foldit (54) could be supported with enhanced real-time laboratory capabilities. The parallel integration of simulation and live experiments (Fig. 2 E and F) not only reduces load on the physical experimentation resources but could also afford direct model validation, such as with a whole-cell model as described in ref. 55. Formal and informal science, technology, engineering, and mathematics (STEM) education is in significant need of new approaches and technologies to enable inquiry-based science learning (13, 14, 22, 34, 44, 56–60), which could be supported as well. Augmented reality (61) could deliver versatile and rich worlds as “μAR” in a cost-effective and scalable manner given the small footprint of microbiology. Advancement in high-throughput life-science technologies (62, 63) will increasingly facilitate such applications, considering that Bioty is run on a cloud laboratory that could already support millions of 1-min long experiments per year at a cost of $0.01 each (14). Just as personalized computers and programming APIs revolutionized the accessibility and mass dissemination of interactive computing (64), we believe that programming toolkits like Bioty could stimulate equivalent innovations for the life sciences.
The link to all code used to implement the cloud laboratory is publicly available on GitHub at https://github.com/hirklab/euglenalab (65). The code for Bioty is in the feature/bioticGameProgramming branch.
Materials and Methods
Cloud laboratory implementation.
The Bioty system is developed on top of the existing cloud laboratory architecture described in ref. 14. The original implementation contained a joystick that allowed users to control remote LEDs on a BPU. A standard web socket connection is used to send the user's joystick commands between the web server and the remote BPU, allowing for real-time interaction.
Each BPU consists of a Raspberry Pi which controls four LEDs. The LEDs are placed over a microfluidic chamber that houses the organisms. The Raspberry Pi is also connected to a Raspberry Pi camera that is placed over a microscope lens facing the microfluidic chamber. The frames from the camera are sent back to the web server and displayed to the user in real time.
Real-time image processing.
The client continuously manipulates the frames returned from the microscope’s live feed using the Chrome browser’s Portable Native Client (PNaCl) toolchain. PNaCl executes native C++ code directly in the browser, performing multiple object tracking via the Kalman filtering algorithm for motion prediction (66) in conjunction with the Hungarian algorithm (67) to continuously match detected object contours. The application control functions render over the HTML5 canvas displaying the live video feed.
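The matching step can be illustrated with a simplified stand-in: for small numbers of cells, exhaustively searching permutations finds the assignment minimizing total squared distance between predicted track positions and detected contours. The production system uses the Hungarian algorithm for this, which scales polynomially; the brute-force version below only illustrates the objective, and the names are illustrative.

```javascript
// Simplified stand-in for detection-to-track matching: brute-force search
// over permutations for the minimum total squared-distance assignment.
// (The real pipeline uses the Hungarian algorithm for efficiency.)
function bestAssignment(tracks, detections) {
  const n = Math.min(tracks.length, detections.length);
  const idx = detections.map((_, i) => i);
  let best = null;
  let bestCost = Infinity;
  const permute = (arr, k) => {
    if (k === n) {
      const chosen = arr.slice(0, n);
      const cost = chosen.reduce((sum, d, t) => {
        const dx = tracks[t].x - detections[d].x;
        const dy = tracks[t].y - detections[d].y;
        return sum + dx * dx + dy * dy;
      }, 0);
      if (cost < bestCost) { bestCost = cost; best = chosen; }
      return;
    }
    for (let i = k; i < arr.length; i++) {
      [arr[k], arr[i]] = [arr[i], arr[k]];
      permute(arr, k + 1);
      [arr[k], arr[i]] = [arr[i], arr[k]];
    }
  };
  permute(idx, 0);
  return best; // best[t] = detection index matched to track t
}
```

In the full pipeline, the Kalman filter supplies the predicted track positions fed into this matching step, and unmatched detections spawn new tracks.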
Application development support.
Some API calls communicate directly with the microscopes, while others perform image processing on the video frames that are returned from the cloud laboratories. This distinction in function implementation is not seen by the user.
The following equations are used as a toy model for the motion of the Euglena simulation, where x_t and y_t are the positions of a Euglena, v is the velocity of a Euglena (which is assumed constant but can vary between individual Euglena), θ_t is the angle of a Euglena in the 2D plane, and θ_L is the angle of the LED light stimulus, all at time t; Δt is the frame rate, and η is random noise:

x_{t+1} = x_t + v cos(θ_t) Δt
y_{t+1} = y_t + v sin(θ_t) Δt
θ_{t+1} = θ_t + α sin(θ_t − θ_L) Δt + η
Each Euglena is given a random initial position on the screen, a random initial orientation angle, and a constant velocity v sampled from a uniform distribution between 0 and 10 pixels per frame. To calibrate this range of velocities, videos of Euglena were analyzed to determine how many pixels on the HTML5 canvas the Euglena tended to move through per frame. The frame rate is set to 1 frame per 10 ms. α is the coupling strength, set to −0.3. Each simulated cell is an ellipsoid with a 5:1 major-to-minor axis ratio.
We used periodic boundary conditions: When a Euglena’s x position moves past the left or right edge of the screen, it retains its y position, velocity, and orientation, appearing on the other side of the screen. The same method is used when its y position moves past the top or bottom edge of the screen. If a Euglena collides with another, as defined by their x and y positions being within 2 pixels of each other, then both Euglena are assigned a new random θ.
This is a simple model capturing the basic idea of Euglena dynamics in response to light. More sophisticated parameter matching between real and simulated Euglena behavior is possible. For example, the Euglena model is not currently dependent on light intensity. Furthermore, more complex models capturing the subtleties of Euglena movement, such as its 3D polygonal motion, helical swimming pattern, and spinning at high light intensities, are possible, but beyond the scope of this work.
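A per-frame update of this kind of model can be sketched as follows. This assumes a heading-relaxation phototaxis model with coupling strength −0.3, constant per-cell speed, additive noise, and periodic boundaries as described above; the function and field names are illustrative, not the actual virtual BPU code.

```javascript
// Minimal sketch of a virtual-BPU motion update for one simulated Euglena.
// cell = { x, y, theta, v }; thetaLED is the stimulus angle (radians);
// dt is the time step in frames; noise is injected explicitly for testability.
const ALPHA = -0.3; // coupling strength to the light stimulus

function stepCell(cell, thetaLED, width, height, dt, noise) {
  // Relax the heading relative to the LED angle, plus rotational noise.
  const theta = cell.theta + ALPHA * Math.sin(cell.theta - thetaLED) * dt + noise;
  // Move forward at constant speed; wrap positions into [0, width) x [0, height)
  // to implement the periodic boundary conditions.
  const x = ((cell.x + cell.v * Math.cos(theta) * dt) % width + width) % width;
  const y = ((cell.y + cell.v * Math.sin(theta) * dt) % height + height) % height;
  return { x, y, theta, v: cell.v };
}
```

The pairwise collision rule (reassigning a random heading when two cells come within 2 pixels) would be applied after all cells have been stepped.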
All user studies were conducted according to Stanford University IRB-18344. Informed consent was obtained from all participants.
Study S1b—HBI novices as programmers.
To evaluate the remote programming of organisms, we started with an on-site study with programmers performing two structured and two free-form programming tasks. Participants were recruited through online mailing lists and self-described programming ability.
Participants were limited to 2 h of total coding time, including familiarizing themselves with the interface and API. The version of Bioty used in this study did not include a virtual simulation mode for the first two participants, but it was provided to the remainder of the participants (including in subsequent studies) in response to feedback about physical resource limitations. Participants worked on site (instead of remotely), as it allowed us to directly observe their actions and interview them.
Seven programmers aged 21–24 y (mean = 22.57 y, SD = 1.13 y) participated in the on-site study. Three of the participants identified as female and four identified as male. Participants were required to be fluent in English and came from a variety of academic backgrounds. Three participants were undergraduate students at Stanford University, three were full-time software developers, and one was a clinical researcher with some coding experience.
Before starting the familiarization tasks (SI Appendix, section S3), participants were shown a working demonstration of the applications that they were asked to modify. The study researchers were available to answer questions about the API and web interface logistics, but answers to the programming tasks were not provided. After the completion of the first set of structured tasks, participants were asked to complete two free-form programming tasks. Data were recorded on an online Google Form. See SI Appendix, section S3 for more details about the study, including the full set of questions participants were asked.
Study S1c—HBI experts as programmers.
To gather information about system use by domain experts, we recruited participants with prior Euglena HBI development backgrounds to program applications on the platform over an extended period. We aimed to compare their prior experiences with their experience with the remote paradigm.
The procedure for the study with HBI experts was identical to that for the study with novices (study S1b), except for where noted here. The participants were asked to perform two structured and two free-form programming tasks. However, this time the participants were not limited to 2 h of total coding time, instead having 1 wk to complete their programs. The HBI experts were also provided a simulation mode, in response to the feedback from study S1b. Logs of the total time actually spent on the IDE were recorded. See SI Appendix, sections S3 and S4 for more details about the study, including the full set of questions participants were asked.
Study S2—HBI novices as end users.
To test whether programs written using this paradigm are ultimately usable for end users, 17 remote HBI novices aged 22–55 y (mean = 26.0 y, SD = 8.2 y) were recruited to participate in the guided experimentation platform. Participants were recruited through the Stanford University email lists and the Eterna news feed. The remote participants were provided with instructions for interacting with the guided experimentation platform. To verify the usability of the programs developed in an independent online setting, no help with the interface was provided by researchers at any point during the study. The simulation mode was implemented for this study. Participants were asked to use both the simulation and live modes.
To analyze the qualitative results, two raters categorized all quotes. A first rater initially categorized the quotes, followed by a second rater who confirmed the first rater’s categorizations. When there was disagreement, the two raters discussed the categorization of quotes.
See SI Appendix, section S5 for more details about the study.
The authors thank Z. Hossain, T. R. Stones, R. Das, and members of the I.H.R.-K. laboratory for suggestions; all volunteers who tested the system in various stages; and NSF Cyberlearning (1324753) for funding.
- To whom correspondence should be addressed.
Author contributions: P.W. and I.H.R.-K. designed research; P.W., K.G.S.-G., S.G., A.R., and I.H.R.-K. performed research; P.W., K.G.S.-G., S.G., and A.R. contributed new reagents/analytic tools; P.W. analyzed data; and P.W. and I.H.R.-K. wrote the paper.
The authors declare no conflict of interest.
This article is a PNAS Direct Submission.
Data deposition: The link to all code used to implement the cloud laboratory is publicly available on GitHub (https://github.com/hirklab/euglenalab).
This article contains supporting information online at www.pnas.org/lookup/suppl/doi:10.1073/pnas.1815367116/-/DCSupplemental.
- Copyright © 2019 the Author(s). Published by PNAS.
This open access article is distributed under the Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).