Thirty-three programs/systems were identified in the twenty-one studies: eight were exclusively available for installation on desktop computers, while the rest were available online, including cloud-based systems, that is, with storage on online servers and on-demand availability. Twenty-one of the studied systems were commercial, paid programs/services, while twelve were free of charge for at least some of their functionalities. Among the free software, two were custom systems designed exclusively for the studied laboratory (Biobank Portal and CCLMS).
Table 4 summarizes the results of the nine studies that assessed the impact of implementing computerized management systems. All of them reported positive results with the use of digitally assisted management. However, problems were identified related to technical constraints (either hardware or software) and to the limited acceptance of users who resist changing already established procedures, impairing the use of some systems to their full potential. Furthermore, staff training and participative management were also recognized as necessary to achieve user engagement with digitally assisted laboratory administration.
Table 4. Results of the studies that assessed the impact of implementing computerized management systems.
System/Software | Reference | Objective | Test Groups | Method | Results |
---|---|---|---|---|---|
Customer Information Control System/Virtual Storage (CICS/VS) | Delorme and Cournoyer [ ] | Qualitative and quantitative evaluation of system limitations and impact on workflow and man/hour relationships | Software developers and users at the Hospital Lab (N.D.) | Qualitative evaluation of the development of integrated modules; during field testing, the workflow was assessed through the evaluation of patient entry forms, the results of sample and data processing, and the final reports. | |
CCLMS | Selznick et al. [ ] | Test system improvement on organization and control of collections. | Cell culture specialists from 2 labs (n = 6) | Qualitative and quantitative evaluation of usability through the system usability scale (SUS) and field notes. | |
MGEA | Anderson et al. [ ] | Assess the impact on experimental workflow for gene expression analysis. | Researchers (n = 7) | A qualitative longitudinal study. Immersion in the work environment. Interviews, observations, and field notes were coded and analyzed. | |
LINA | Yousef et al. [ ] | Effectiveness and acceptance of an inventory system for management of oligonucleotides, strains, and cell lines | Lab staff (n = 10) | Qualitative analysis of the implementation process; quantitative evaluation of usability through the SUS. | |
AdLIMS | Calabria et al. [ ] | Evaluate effectiveness on sample tracking in genomic studies. | Developers and potential users/clients (N.D.) | Analysis of the requirements and expectations of users/clients in terms of functionalities; qualitative analysis of the development process. | |
LabCIRS | Dirnagl et al. [ ] | Assess the acceptability and usability of a risk-assessment software for the traceability of reported cases. | Lab staff (n = 31) | Statistical and qualitative analysis of the data before and after the implementation of the tool. Online questionnaire with two questions on software usability. | |
LC Virtual Biorepository of the Antibacterial Resistance Leadership Group (ARLG) | Manca et al. [ ] | Assess the impact of the implementation of a virtual repository on the management of data and biological collections. | Customers from research labs and diagnostic companies (N.D.) | Qualitative evaluation of the efficiency of the primer bank sequences. Quantitative retrospective assessment of impacts on services provided. | |
Quartzy | Timoteo et al. [ ] | Assess the impact of the implementation on the workflow and the perception of users at an academic laboratory. | Lab staff (n = 30) | Qualitative analysis of the team’s attitude towards implementation, including a structured questionnaire and focus group assessments. Management performance indicators were also compared before and after implementation. | |
SDLC | Dennert, Friedrich, and Kumar [ ] | Evaluate the development steps of a database of biological sample inventories. | Researchers from different fields of medicine (N.D.) | Immersion in the work environment: the cycles of all resources were developed and tested. User training and interviews were conducted to assess applicability and identify users’ needs. | |
N.D.: number of participants not determined.
Regarding the management topics addressed in each laboratory, digital systems were employed for several different purposes, from purchases and administrative tasks to the control of cell collections, inventories in general, data storage, and the management of animal colonies.
All thirty-two of the described software systems addressed one or more management topics recommended by good laboratory practice documents [ 3 ], including experimental workflow, data storage, integration with laboratory equipment, statistical analysis, comparison of experimental data, animal colonies, biorepositories, inventory, and risks. The integration of the work demands of academic health sciences laboratories with the items of compliance with the GLP guidelines is shown in the chart presented in Figure 2 .
The main applications of the identified software on the different sections and chapters of the OECD GLP Principles [ 3 ].
4.1. Contributions to Adherence to GLP Principles
While the search strategy of the present review identified several different laboratory management systems, few of the eligible studies provided a focused discussion on this topic. This lack of direct scientific evidence prevents the present review from quantitatively assessing to what extent digital systems can collectively contribute to achieving accreditation. On the other hand, all the identified software addressed management issues related to at least one of the GLP principles, and, in some studies, more than one software package was used to meet the different demands related to quality systems.
In this sense, the approach proposed by Timoteo et al. [ 6 ] could be applied to the present sources of data to chart the main management topics affected by these programs and systems in relation to good practice guidelines. The chart presented in Figure 2 shows how the types of management supported by the software in academic laboratories relate to several items from Section II of the OECD GLP Principles [ 3 ]. This relationship is revealed by an emphasis on the responsibilities of staff and facilities management, work planning, the availability of standard operational procedures (SOPs) covering all study activities, procedure analysis, the use and maintenance of equipment, as well as the application of standards for receiving test samples, their chain of custody and logistics, inventory control, the traceability of reagents, and the validation of methods.
For a better understanding of the functions of these systems, a brief presentation of each follows, with an emphasis on how the computerized systems meet the GLP principles listed in Figure 2 .
The GLP principles require precise definitions of the different steps in the performance of a study, as described in item #8 of the OECD document [ 3 ], including the responsibilities of the personnel involved and the facilities and status of the equipment employed (item #3), among other factors. Furthermore, quality assurance (item #2) requires identifying and monitoring critical steps, checkpoints, and possible sources of error. Among the different systems identified in the present review, some studies described digital tools dedicated to managing the workflow of study performance in a systematized fashion.
In the late 1990s, Goodman and colleagues [ 16 ] presented LabFlow, a software package dedicated to genetics and mapping studies. Workflow management was not a recognized study topic at that time and, while LIMS already existed, no commercial LIMS product supported workflow management in a specific sense. In this scenario, LabFlow appeared among the first digital solutions, with a workflow model in which objects flow through different laboratory tasks (such as DNA extraction, selection of clones, and sequence analysis) under programmatic control. An essential feature of this software was that it already allowed programmers to customize workflows to different laboratory needs.
Anderson et al. [ 19 ] described, in 2007, the implementation of the Microarray Gene Expression Analysis (MGEA), a software package developed by Rosetta Biosoftware (a subsidiary of Merck Inc.), that helped to integrate workflow information related to experimental design, data collection, and bioinformatic analysis of genomic results. Despite the high costs of the license and its renewals, the authors expected that implementing a commercially available service would bring advantages such as security in terms of support for operation and uniformity between different research centers, thus facilitating communication between employees. However, their qualitative analysis observed that the system was not used to its full potential, and its acceptance by staff would demand ongoing training and even an evolution of academic curricula towards the use of bioinformatics tools.
In 2019, Gaffney et al. [ 11 ] described the design and implementation of GEM-NET, a software that allowed members of the C-GEM (Center for Genetically Encoded Materials, USA) to integrate research efforts connecting six laboratories spread across three university campuses. GEM-NET was designed to support science and communication by integrating task management, scheduling, data sharing, and internal communications. A set of more than 20 tools was organized, including two applications customized for the Institution’s specific needs of workflow management. The tools are highly interconnected, but the set can be divided into access control, data storage, data navigation, project monitoring, teamwork, internal communication, and public engagement. The authors conclude that GEM-NET provides a high level of security and reliability in workflow management.
Different items of the GLP principles describe the need for the secure storage, filing, and retrieval of research data (item #7.4), including study plans, raw data, final reports, test system samples, and specimens (item #8.3), and their related archiving facilities (item #3.4). Furthermore, item #7 (standard operating procedures) requires the preparation and observance of documents that guarantee the quality and integrity of the data generated by the studies. Sub-item #7.4, for example, states that, in the case of computerized systems, validation, operation, maintenance, security, change control, and the backup system must be observed.
Within the selected studies, we found reports of computerized systems used to manage data from various laboratory environments and of how these data were made available to the research groups. In the early 1980s, Delorme and Cournoyer [ 15 ], in the microbiology laboratory of a University Hospital, tested the CICS/VS (Customer Information Control System/Virtual Storage), consisting of a customer data repository on a central computer shared with medical records databases, admission offices, patient accounting, and other medical-administrative services. The system also served as a virtual storage system, including data from microbiological samples. It performed activities such as report printing, data quality control, epidemiological assistance, germ identification, teaching, and research in the different subspecialties of microbiology. The authors carried out a qualitative and quantitative assessment, identifying an improvement in workflow without an increase in personnel, together with reductions in report production time, system downtime, and other parameters.
Viksna et al. [ 20 ] focused on collecting, storing, and retrieving data on research participants and biomedical samples through electronic management. For this, they proposed PASSIM (Patient and Sample System for Information Management), a web-based customizable system that could be used for sending, managing, and retrieving samples and data from research subjects while ensuring the confidentiality of the records. This tool was instrumental in managing information in clinical research studies involving human beings and replaced the more expensive LIMS, which required investments of time, effort, and resources that were not always available.
Electronic laboratory notebooks (ELN) are programs designed to replace traditional research notebooks. These electronic tools may register protocols, field/lab observations, notes, and other data inserted through a computer or mobile device, offering several advantages over paper notebooks [ 19 ]. Machina and Wild [ 22 ] investigated the importance of ELNs when integrated with other computer tools, such as laboratory information management systems, analytical instrumentation, data management systems, and scientific data. They observed that the type of laboratory (analytical, synthesis, clinical, research) was a primary source of differences when trying to integrate ELN with the available tools. Therefore, based on the observation that there was no well-established path for the effective integration of these tools, the authors decided to review and evaluate some of the adopted approaches.
Calabria et al. [ 24 ], in 2015, introduced adLIMS, a software for managing biological samples (primarily DNA) and metadata for patient samples and experimental procedures. The authors described how it was possible to produce this system by customizing a previous open-source software, ADempiere ERP. First, they collected the requirements of the end-users, verifying the desired functionalities of the system and Graphical User Interface (GUI), and then evaluated the available tools that met the desired requirements, ranging from pure LIMS to content management and corporate information systems. The authors report that the system supported critical issues of sample tracking, data standardization, and automation related to NGS (next-generation sequencing).
By 2021, Cooper et al. [ 30 ] reported using integrated systems that ensure the sharing of essential data for current research. The authors followed 15 years of development and implementation of the LabDB system, initially designed to manage structural biology experiments and progressively improved into a sophisticated system that integrates a range of experimental biochemical, biophysical, and crystallographic data. The LabDB central software module handles data on laboratory personnel, chemical stocks, and storage locations. It is currently used by the American/Canadian consortium CSGID (Center for Structural Genomics of Infectious Diseases) and several prominent research centers. The authors identified the difficulties and resistance of some researchers in adopting these systems as the main limitation, often due to the effort needed to import data from electronic notebooks or laboratory spreadsheets, with which most researchers are already familiar. Nevertheless, the authors consider this effort worthwhile, since these older approaches do not remove or even track inconsistencies and do not adapt well to the requirements of modern research.
It is essential to note that, for accreditation purposes, hosted services (cloud archiving, backup, or processing) require written agreements describing the responsibilities of the informatics services. Test facility management must be aware of the potential risks to data integrity resulting from third-party storage.
Adherence to the GLP principles requires the adequate management of research equipment (OECD item #4), including proper calibration, maintenance, scheduling, and responsible staff in the test facility. Several commercially available systems, such as QReserve, cited by Perkel [ 9 ], are entirely dedicated to these functions, with integrated reservation calendars, administration of equipment status and availability, a repository of maintenance documentation, and a registry of use time. Other all-purpose management systems, such as Labguru, provide most of these functions in a dedicated equipment module. That was also the case of Quartzy, which was free for individual researchers until 2016, as reported by Timoteo et al. [ 6 ]. This study described how the implementation of the software optimized the shared use of equipment in a multiuser clinical research unit and the advantages of allowing equipment scheduling, check-in, and check-out remotely, even from mobile phones.
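The core of such an equipment module is an overlap check on reservation intervals. The sketch below illustrates that logic under stated assumptions: the `EquipmentSchedule` class and its fields are hypothetical and are not taken from Quartzy, Labguru, or QReserve.

```python
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class EquipmentSchedule:
    """Minimal reservation book for one shared instrument (illustrative)."""
    bookings: list = field(default_factory=list)  # list of (start, end, user)

    def reserve(self, start: datetime, end: datetime, user: str) -> bool:
        """Add a booking unless it overlaps an existing one."""
        if start >= end:
            raise ValueError("start must precede end")
        for s, e, _ in self.bookings:
            if start < e and s < end:  # two intervals overlap
                return False
        self.bookings.append((start, end, user))
        return True


sched = EquipmentSchedule()
ok1 = sched.reserve(datetime(2024, 1, 8, 9), datetime(2024, 1, 8, 11), "ana")
ok2 = sched.reserve(datetime(2024, 1, 8, 10), datetime(2024, 1, 8, 12), "ben")
ok3 = sched.reserve(datetime(2024, 1, 8, 11), datetime(2024, 1, 8, 12), "ben")
```

Two bookings conflict exactly when each starts before the other ends; back-to-back reservations, with one ending at the instant the next begins, are allowed.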
Several procedures related to pre-clinical studies conducted with animals are addressed in the GLP principles, mostly in item #5 of the OECD document (test system) and subsection #5.2 (Biologicals). These include a proper registry of the housing, handling, and care of animal test systems to ensure the quality of the data. Additionally, records of the source, date of arrival, and arrival condition of test systems should be maintained. Two selected studies described the use of vivarium monitoring software to ensure the remote control of animal stocking, accommodation, handling, and care, the identification of colonies, and the inventory of supplies.
Milisavljevic et al. [ 8 ] described, in 2010, the Laboratory Animal Management Assistant (LAMA), a software adapted from the LIMS concept to optimize small animal research management. It was initially developed to manage hundreds of new mouse strains generated by an extensive functional genomics program in Canada. The authors had recognized the need for more readily available, suitable, easy-to-use systems and software interfaces. LAMA was implemented for a broad community of users, allowing individual research labs to track their colonies independently within a larger facility. This open-access software is still available to the research community.
Allwood et al. [ 23 ] described, in 2015, how smartphones could help researchers in the remote management of animal colonies. The authors proposed Lennie, an app that introduced a new method for managing small to medium-sized animal colonies, allowing users to remotely access the facilities, and create and edit several functions virtually from anywhere. Its use contributes to the optimization of workflow and planning of experiments, offering a user-friendly experience. Possible updates to the functionalities were also suggested, such as camera integration with the calendar, permission for data sharing, and permanent storage.
In order to comply with the GLP standards, samples that arrive at a laboratory must have records that include the characterization and reference, date of receipt, expiration date, quantities, and storage data, following item #6.1 (receiving, handling, sampling, and storage). This issue is of utmost importance for managing biobanks and biorepositories, creating a need for specific software for successful management.
Boutin et al. [ 25 ] carried out a study on a complex system of various software packages that contributed to the management of a biobank. The core object of management was an extensive repository of samples and data available to researchers. The platform requires robust software and hardware, as it works with large amounts of data stored and transferred to research groups. In the study, the authors described each of the five custom and commercially available information systems integrated into the existing clinical and research systems and discussed the safety, efficiency, and challenges inherent in the construction and maintenance of this infrastructure. Constrack was used to manage patient data; the Enterprise Master Specimen Index (EMSI) served as a sample indexing system; STARLIMS managed inventory; GIGPAD managed data and integrated equipment; and the Biobank Portal was the customized application connecting all the systems.
Manca et al. [ 28 ] assessed the structure of a central laboratory of the Antibacterial Resistance Leadership Group (ARLG) in the USA. This group leads the evaluation, development, and implementation of laboratory-based research and supports standard or specialized laboratory services. The laboratory included both a physical and a virtual biorepository. They developed digital procedures for reviewing and approving strain requests, providing guidance during the selection process, and monitoring the transfer of strains from the distribution laboratories to the requesting investigators.
Paul et al. [ 29 ] also describe a biobank management system, with great emphasis on cloud data storage. The authors considered that biobanks have become an essential resource for health research and drug discovery; however, collecting and managing large volumes of data (bio-specimens and associated clinical data) requires biobanks to adopt more advanced data management solutions. Paul and Chatterjee [ 27 ] point out that in the current COVID-19 pandemic scenario, which requires quick, global action, virtual biobanks play a crucial role on several different fronts, from diagnosis to research. Without the need to physically handle biological samples, these banks may allow the sharing of medical data and networking for better cooperation between biobanks at the national and international levels.
Recently, Dennert, Friedrich, and Kumar [ 1 ] explained the various implications of managing inventories of biological samples from different research areas that employ different cryopreservation methods. Such management must ensure the availability of items, easy tracking, and the optimization of shared space among the various research groups. For this, the authors presented the stages of developing an inventory data model using a Microsoft Access database, through phases that included training, planning, implementation, and maintenance, as well as the establishment of manuals and protocols for standardized data entry. Using the software development lifecycle (SDLC), the authors arrived at a database construction model; this model requires frequent communication with users to ensure transparency and quality improvement.
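Such a relational inventory model can be sketched in a few linked tables. The example below is a hypothetical, heavily simplified illustration (all table and column names are assumptions, not the schema developed by the authors), using Python's built-in SQLite in place of Microsoft Access:

```python
import sqlite3

# Hypothetical two-table model: storage units and the samples they hold.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE freezer (
    freezer_id   INTEGER PRIMARY KEY,
    location     TEXT NOT NULL,
    temperature  REAL NOT NULL          -- e.g., -80.0
);
CREATE TABLE sample (
    sample_id    INTEGER PRIMARY KEY,
    label        TEXT NOT NULL UNIQUE,  -- standardized entry per lab manual
    sample_type  TEXT NOT NULL,         -- e.g., 'serum', 'tissue'
    group_name   TEXT NOT NULL,         -- owning research group
    freezer_id   INTEGER NOT NULL REFERENCES freezer(freezer_id),
    date_stored  TEXT NOT NULL
);
""")
conn.execute("INSERT INTO freezer VALUES (1, 'Room 12, rack B', -80.0)")
conn.execute(
    "INSERT INTO sample VALUES (1, 'SER-0001', 'serum', 'cardio', 1, '2024-05-01')"
)

# Easy tracking: list every stored sample of a group, with its location.
rows = conn.execute("""
    SELECT s.label, f.location
    FROM sample s JOIN freezer f USING (freezer_id)
    WHERE s.group_name = 'cardio'
""").fetchall()
```

The `UNIQUE` label and the foreign key from `sample` to `freezer` enforce the standardized data entry and traceability goals described above at the database level rather than by convention alone.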
Identifying incidents and assessing risks is an essential part of the GLP standards that requires an adequate work plan and a quality assurance program (OECD document item #2). Item #8.3 of the GLP states that any change to data during the conduct of a study must be recorded together with the person responsible for the change, ensuring traceability and enabling a complete audit trail that shows all changes without masking the original data.
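The audit-trail requirement can be illustrated by an append-only change log: every edit records the old value, the new value, the author, and a timestamp, and no entry is ever deleted or overwritten. The class below is a minimal sketch of this principle, not the design of any system cited in this review.

```python
from datetime import datetime, timezone


class AuditedRecord:
    """Record whose every change is appended to an immutable trail."""

    def __init__(self, field_values: dict, author: str):
        self._values = dict(field_values)
        self._trail = [("created", dict(field_values), author,
                        datetime.now(timezone.utc))]

    def change(self, field: str, new_value, author: str):
        """Apply an edit and log old/new value, author, and timestamp."""
        old = self._values[field]
        self._values[field] = new_value
        # The original value stays visible in the trail; nothing is masked.
        self._trail.append((field, {"old": old, "new": new_value},
                            author, datetime.now(timezone.utc)))

    def audit_trail(self):
        return list(self._trail)  # full history, in chronological order
```

Because edits only append, an auditor can reconstruct every intermediate state of the record, which is exactly what item #8.3 asks of computerized systems.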
The work of Dirnagl et al. [ 27 ] discusses how error management is fundamental to complying with international standards while describing the implementation of LabCIRS (Laboratory Critical Incident Reporting System), a simple, accessible, open-source critical incident reporting system for pre-clinical and basic academic research groups. The software was implemented within an electronic quality management system, accessible from any laboratory computer, enabling incident reports that included photo uploads, automatic alerts for new reports, and archiving.
Item #6.2 of the GLP principles clearly states that all material from a study must be adequately identified, including the batch number, purity, composition, concentrations, or other characteristics, to define each item or reference item properly. It also indicates the need to keep the receipt and expiration dates, quantities received/used in the studies, and storage instructions for the stock of materials. In this review, several articles emphasized this need to monitor inventories with the help of computerized systems.
Nayler and Stamm [ 17 ], in 1999, described a laboratory management software, ScienceLab Database (SLD), which offered a management platform for molecular biology research laboratories. The program primarily managed the stock of biological samples, including plasmids, antibodies, cell lines, and protocols, and included an ordering and grants management system. The authors considered that this system met the specific needs of a small to medium-sized research laboratory, helping to organize inventories of valuable reagents, to store and maintain information about these items, and to simplify orders and processes.
By 2016, Catena et al. [ 26 ] had developed AirLab, a cloud-based tool with web and mobile interfaces, to organize antibody repositories and their multiple conjugates. Due to the large amount of data generated by these collections, the authors recognized the need for dedicated software. The work demonstrated that AirLab simplifies the purchase, organization, and storage of antibodies, providing a panel to record results and share antibody validation data.
Yousef et al. [ 21 ] described LINA (Laboratory Inventory Network Application), a set of Windows-based inventory management software configured to work on a computer network with multiple users. Designed for small molecular biology laboratories, it uses Access databases to assign a new identifier to each new reagent, providing a library that helps with searching and comparing DNA sequences. It was later extended with several features, such as additional table types, compatibility with other operating systems, barcoding, and improved security. According to the authors, the resources provided by LINA are comparable to those available in commercial databases, with the advantage of providing a free database maintenance application for academic laboratories.
In an opinion article published in Nature’s section “Toolbox”, Perkel [ 9 ] describes several low-cost computerized electronic inventory systems as a means to overcome tortuous searches, old notebooks, out-of-date spreadsheets, and “frost-encrusted freezer boxes” when identifying laboratory samples and resources. Besides programs discussed by other authors in this review, such as LINA and Quartzy, the article cites other systems: OpenFreezer, a free web-based system for registering sample data such as location, origin, and biological properties; the cloud-based StrainControl (DNA Globe, Sweden), free for individual researchers, which supports managing different lab-organism strains, molecules, and chemicals; mLIMS, developed by BioInfoRx (Madison, WI, USA) and designed to track rodent colonies; Labguru (BioData, Cambridge, MA, USA), a widely known paid cloud-based all-in-one electronic notebook; and CISPro (BioVia, Waltham, MA, USA), described as a functional institute-wide tracking system for shared resources. Despite differences in accessibility and resources, all of these systems share similar search engines linked to customizable databases.
Timoteo et al. [ 6 ] evaluated, by 2020, the impact of implementing a multi-module, free-of-charge online management system (Quartzy, Quartzy Inc., Santa Clara, CA, USA) on the workflow of a Brazilian academic clinical research laboratory and on the perception of its users. Until 2016, the software modules could assist with various aspects and demands of the laboratory, including user communications, multiuser equipment management, material inventory, research documents, and tracking of supply orders. Unfortunately, Quartzy was recently reduced to a simpler version, consisting only of an inventory and purchase tracking system that connects researchers to hundreds of life sciences brands and suppliers.
Effectiveness is a fundamental point to consider in the potential role of software for laboratory management. However, most of the eligible studies identified in our search did not investigate the reported systems’ impact through either qualitative or quantitative assessments. Moreover, even where evaluations were performed, few studies identified or discussed the limitations and drawbacks of the studied information systems. The studies with evaluations reported, among other aspects, improvements in organization, workflow, traceability, reliability, acceptability, and good use of the software. A decrease in errors from manual processes was also reported, with gains in productivity and reduced workload. In some specific cases, the control of frozen cells was positively evaluated, generating efficiency and better results in partner laboratories. On the other hand, regarding limitations, older articles (before 2000) identified problems mostly related to system performance, which was sometimes slow and needed adjustments at a time when information technology was still incipient. The limitations of the most current systems relate more to selective satisfaction with and acceptance of software tools, specific to the function and objective of each group, and, in some cases, to the resistance of researchers and staff to abandoning old ways and migrating to digital tools, which were consequently not used to their full potential within the laboratory.
To adequately assess the impact of these electronic management systems, different methodological approaches are available, such as pre/post-tests evaluating quantitative indicators of performance and service provision. However, as Timoteo et al. [ 6 ] discussed, the complex nature of the services provided by multiuser academic research facilities may hinder obtaining feedback through quantitative indicators. In this sense, the perception and attitudes of staff towards the management system may contribute to understanding its impact on the workflow and on the search for quality at academic clinical research laboratories, as well as provide data for developing or improving actions and strategies toward quality and compliance [ 31 , 32 , 33 ]. Accordingly, validated tools may provide a means to standardize the evaluation of laboratory management software, allowing comparisons of the effectiveness and adequacy of these systems in different applications. Two studies [ 18 , 21 ] proposed the use of an important tool to investigate the effectiveness and efficiency of software, the system usability scale (SUS). This tool, developed by John Brooke at Redhatch Consulting (UK), consists of a simple, ten-item attitude questionnaire using a Likert scale to provide a global view of subjective assessments of usability; it has been validated as providing reliable results even with small samples/study groups, which was the case in most studies identified in this review. Therefore, it may represent a potential (although so far underused) tool for further studies on implementing laboratory management systems.
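SUS scoring itself is simple and well standardized: each of the ten 1-5 Likert responses is converted to a 0-4 contribution (response minus 1 for odd-numbered items, 5 minus response for even-numbered items), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal implementation of Brooke's scoring:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from the ten
    1-5 Likert responses, using Brooke's standard scoring: odd items
    contribute (response - 1), even items contribute (5 - response),
    and the total is multiplied by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("expected ten responses on a 1-5 scale")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i is 0-based, so even i
                for i, r in enumerate(responses))  # corresponds to odd items
    return total * 2.5
```

Because odd items are phrased positively and even items negatively, this conversion makes every item contribute in the same direction, so a score of 100 means the most favorable possible response pattern.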
Different studies point out that staff training is one of the most important factors in the successful implementation of these systems and a key part of accepting and adapting to a new management model. Dirnagl et al. [ 27 ] evaluated the impact on staff attitudes toward incident reporting after one year of implementation, observing that training led to greater adherence to the goal of complying with international quality standards and to a mature culture of error management. Timoteo et al. [ 6 ] performed a qualitative evaluation of staff perception of software implementation, in which most users stated that constant training and leadership were pivotal for the successful use of the software. On the other hand, Anderson et al. [ 19 ] reported that limited access to training was a barrier to software use during the implementation of MGEA, and that the lack of ongoing training might have contributed to a progressive de-emphasis of system use among the laboratory staff. These data point to the need for careful planning by the PIs to ensure continuous and inclusive training within the implementation program of management systems.
Regarding availability and accessibility, until 2010, most of the identified programs had to be downloaded/installed on specific laboratory computers [ 19 , 30 ], although some could already integrate local area networks (LANs), as described by Delorme and Cournoyer in 1980 [ 15 ]. In the past decade, technology has advanced to online software, expanding even to applications (apps) on mobile phones, reflecting the current expectations of users and consumers. With app technology permeating all fields of our daily lives, it would be natural for this technological paradigm to reach laboratory and research technologies. Indeed, a big leap was identified towards the proper integration between lab management systems and the new mobile universe. Real-time communication makes it possible, for example, for inventory checks, equipment scheduling, and data verification of an animal colony to be performed while in transit. Multicenter studies can share data in real time, as recently observed in the fast-paced studies of vaccines against SARS-CoV-2 since 2020, which relied heavily on technological development and efficient data management [ 34 ].
Begg et al. [ 35 ] discussed how computer systems are of particular importance in the process of GLP certification in low- and middle-income countries, even though their role is not always emphasized in accreditation systems around the world. This review identified that knowledge on laboratory management software originates mainly, as expected, from developed, high-income countries with advanced information technology industries and significant investment in technology and support for universities and study centers (USA, Germany, Canada, United Kingdom, Switzerland). From a critical standpoint, this may indicate an economic bias in technological development on the theme, as developing countries remain consumers of technology rather than producers and developers, reflecting little investment in this (and other) technological areas.
The costs of implementing computerized systems may represent one of the main challenges for public Academic Health Centers, since these Institutions generally face tight budgets to support several laboratories, researchers, and research lines. Such limitations are expected to be amplified in low- to middle-income countries, which could benefit from low-cost or cost-free initiatives.
In general, the development and maintenance of information systems are made sustainable by subscription services. The present review identified systems that addressed a full spectrum of fundamental issues in the management of academic laboratories, such as inventory control and organization and equipment scheduling, on a free-of-charge basis: Quartzy, for example, incorporated catalogs from various sponsors (reagent suppliers) and suggested their products when orders were placed [ 9 ]. However, such a business model probably did not cover the maintenance costs of the platform, as Quartzy had shut down all functions not related to inventory/purchases by 2016 and recently introduced a fee for Institutional users. It is also possible that users from outside the USA and Europe could not use the vendor-related functionalities, as customer services and representatives in regions such as South America would not connect directly to the system [ 6 ]. On the other hand, LINA is an example of a system that remained free of charge, although limited to the needs of small molecular biology laboratories [ 21 ], with much simpler functionality than well-known commercial applications such as Labguru. Other services, such as QReserve, offer both a free version and a paid version with increased functionality, allowing low-budget academic laboratories to use some free resources, such as equipment reservation and management, through a more straightforward interface.
A common profile among entirely free software is the in-house academic program, such as Biobank Portal and CCLMS, customized for the use of the developing group and usually without widespread adoption in other institutions. Even though such programs may present advantages in addressing the specific demands of their developers, the lack of a profound, systematic evaluation of performance in most selected studies does not allow one to infer whether they are more or less effective than commercial software. In this sense, Boutin et al. [ 25 ] report that the laboratory IT framework may face challenges common to industry settings, where cost overrun is prevented by assessing the cost-effectiveness of purchasing commercially available applications versus designing in-house custom ones. An interesting way to achieve broader applicability for such software is to release open-source code, as done by Boutin et al. [ 25 ], paving the way for other programmers to adapt the tool to different laboratory specificities. It is important to note that investment from government bodies worldwide could also contribute to the development of freely available tools as part of public policies focused on increasing overall quality and adherence to good practices in health sciences research. In this sense, the encouragement of startups involving interdisciplinary initiatives can turn universities and academic centers into important stakeholders in covering technological gaps in low- or middle-income countries [ 36 ].
The present Scoping Review has limitations, mainly related to the impossibility of exhausting the literature on laboratory software, reflected in the choice not to include programs dealing only with the transmission and handling of analysis results and laboratory data, such as pure LIMS or analytical bioinformatics software. Despite their fundamental role, these types of software have already been widely discussed [ 37 , 38 , 39 , 40 ], and most of these systems were not designed to support the management of staff and shared resources, for example. Additionally, the scientific literature probably does not reflect the abundance of available software, since developers and the scientific community usually treat such software as a commercial tool rather than a research topic. Nevertheless, regardless of such limitations, the present review was able to map a framework that points to the great applicability of these systems in the pursuit of quality and good practices in academic experimental medicine laboratories, where restricted availability of resources and staff and limited management experience are common. Therefore, the gaps identified here can serve as an indication for new studies that seek to assess, quantitatively or qualitatively, the impact of implementing these tools on best practices at academic health Institutions.
The present literature review mapped studies from the last four decades that propose and evaluate the impact of digital tools in the management of health sciences research laboratories across several different applications, ranging from administrative workflow management and data traceability to virtual biobanking. These functions have the potential to contribute to adherence to different GLP principles. However, the evidence for their effectiveness is still limited and requires further investigative efforts.
The authors acknowledge the financial support in scholarships from the Brazilian agencies CNPq, CAPES, and FAPERJ. The authors acknowledge the technical support by Jean Carlos Nascimento.
The following are available online at https://www.mdpi.com/article/10.3390/healthcare9060739/s1 , Table S1: Preferred Reporting Items for Systematic reviews and Meta-Analyses extension for Scoping Reviews (PRISMA-ScR) Checklist.
Conceptualization, G.A., M.T. and B.O. (Bruna Oliveira); methodology, G.A. and C.F.d.A.B.M.; software, P.M.; formal analysis, M.T., R.B., J.d.S. and L.D.; investigation, M.T., E.L., J.d.S. and G.A.; resources, P.M. and C.F.d.A.B.M.; data curation, C.F.d.A.B.M. and G.A.; writing—original draft preparation, M.T., E.L., A.C.B. and L.D.; writing—review and editing, G.A. and C.F.d.A.B.M.; project administration, B.O. (Beni Olej). All authors have read and agreed to the published version of the manuscript.
This research received no external funding.
Not applicable.
The authors declare no conflict of interest.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Quantum computers hold the promise of being able to quickly solve extremely complex problems that might take the world’s most powerful supercomputer decades to crack.
But achieving that performance involves building a system with millions of interconnected building blocks called qubits. Making and controlling so many qubits in a hardware architecture is an enormous challenge that scientists around the world are striving to meet.
Toward this goal, researchers at MIT and MITRE have demonstrated a scalable, modular hardware platform that integrates thousands of interconnected qubits onto a customized integrated circuit. This “quantum-system-on-chip” (QSoC) architecture enables the researchers to precisely tune and control a dense array of qubits. Multiple chips could be connected using optical networking to create a large-scale quantum communication network.
By tuning qubits across 11 frequency channels, this QSoC architecture allows for a new proposed protocol of “entanglement multiplexing” for large-scale quantum computing.
The team spent years perfecting an intricate process for manufacturing two-dimensional arrays of atom-sized qubit microchiplets and transferring thousands of them onto a carefully prepared complementary metal-oxide semiconductor (CMOS) chip. This transfer can be performed in a single step.
“We will need a large number of qubits, and great control over them, to really leverage the power of a quantum system and make it useful. We are proposing a brand new architecture and a fabrication technology that can support the scalability requirements of a hardware system for a quantum computer,” says Linsen Li, an electrical engineering and computer science (EECS) graduate student and lead author of a paper on this architecture.
Li’s co-authors include Ruonan Han, an associate professor in EECS, leader of the Terahertz Integrated Electronics Group, and member of the Research Laboratory of Electronics (RLE); senior author Dirk Englund, professor of EECS, principal investigator of the Quantum Photonics and Artificial Intelligence Group and of RLE; as well as others at MIT, Cornell University, the Delft Institute of Technology, the U.S. Army Research Laboratory, and the MITRE Corporation. The paper appears today in Nature .
Diamond microchiplets
While there are many types of qubits, the researchers chose to use diamond color centers because of their scalability advantages. They previously used such qubits to produce integrated quantum chips with photonic circuitry.
Qubits made from diamond color centers are “artificial atoms” that carry quantum information. Because diamond color centers are solid-state systems, the qubit manufacturing is compatible with modern semiconductor fabrication processes. They are also compact and have relatively long coherence times, which refers to the amount of time a qubit’s state remains stable, due to the clean environment provided by the diamond material.
In addition, diamond color centers have photonic interfaces, which allow them to be remotely entangled, or connected, with other qubits that aren’t adjacent to them.
“The conventional assumption in the field is that the inhomogeneity of the diamond color center is a drawback compared to identical quantum memory like ions and neutral atoms. However, we turn this challenge into an advantage by embracing the diversity of the artificial atoms: Each atom has its own spectral frequency. This allows us to communicate with individual atoms by voltage tuning them into resonance with a laser, much like tuning the dial on a tiny radio,” says Englund.
Achieving this at scale is especially difficult because of the qubit inhomogeneity: to communicate across qubits, multiple such “quantum radios” must be dialed into the same channel, a condition that becomes near-certain when scaling to thousands of tunable qubits. The researchers met that challenge by integrating a large array of diamond color center qubits onto a CMOS chip that provides the control dials. The chip can incorporate built-in digital logic that rapidly and automatically reconfigures the voltages, enabling the qubits to reach full connectivity.
“This compensates for the inhomogeneous nature of the system. With the CMOS platform, we can quickly and dynamically tune all the qubit frequencies,” Li explains.
Lock-and-release fabrication
To build this QSoC, the researchers developed a fabrication process to transfer diamond color center “microchiplets” onto a CMOS backplane at a large scale.
They started by fabricating an array of diamond color center microchiplets from a solid block of diamond. They also designed and fabricated nanoscale optical antennas that enable more efficient collection of the photons emitted by these color center qubits in free space.
Then, they designed and mapped out the chip from the semiconductor foundry. Working in the MIT.nano cleanroom, they post-processed a CMOS chip to add microscale sockets that match up with the diamond microchiplet array.
They built an in-house transfer setup in the lab and applied a lock-and-release process to integrate the two layers by locking the diamond microchiplets into the sockets on the CMOS chip. Since the diamond microchiplets are weakly bonded to the diamond surface, when they release the bulk diamond horizontally, the microchiplets stay in the sockets.
“Because we can control the fabrication of both the diamond and the CMOS chip, we can make a complementary pattern. In this way, we can transfer thousands of diamond chiplets into their corresponding sockets all at the same time,” Li says.
The researchers demonstrated a 500-micron by 500-micron area transfer for an array with 1,024 diamond nanoantennas, but they could use larger diamond arrays and a larger CMOS chip to further scale up the system. In fact, they found that with more qubits, tuning the frequencies actually requires less voltage for this architecture.
“In this case, if you have more qubits, our architecture will work even better,” Li says.
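A toy statistical model suggests why more qubits can mean smaller tuning shifts: with more qubits drawn from the same spectral distribution, some qubit sits ever closer to each target channel, so the frequency shift (and hence the voltage) needed per channel shrinks. This sketch is an assumption-laden illustration, not the paper's analysis:

```python
import random

# Hypothetical model: qubit frequencies uniform in [0, 1), 11 target channels.
random.seed(1)
N_CHANNELS = 11
channels = [(k + 0.5) / N_CHANNELS for k in range(N_CHANNELS)]

def mean_min_detuning(n_qubits, trials=200):
    """Average, over channels and trials, of the smallest frequency shift
    needed to bring some qubit onto each channel."""
    total = 0.0
    for _ in range(trials):
        freqs = [random.random() for _ in range(n_qubits)]
        total += sum(min(abs(f - c) for f in freqs) for c in channels)
    return total / (trials * N_CHANNELS)

small, large = mean_min_detuning(64), mean_min_detuning(1024)
print(f"mean shift, 64 qubits:   {small:.4f}")
print(f"mean shift, 1024 qubits: {large:.4f}")
```

The required shift falls roughly as 1/N for N qubits in this model, consistent in spirit with the observation that the architecture works better as it scales.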
The team tested many nanostructures before they determined the ideal microchiplet array for the lock-and-release process. However, making quantum microchiplets is no easy task, and the process took years to perfect.
“We have iterated and developed the recipe to fabricate these diamond nanostructures in the MIT cleanroom, but it is a very complicated process. It took 19 steps of nanofabrication to get the diamond quantum microchiplets, and the steps were not straightforward,” he adds.
Alongside their QSoC, the researchers developed an approach to characterize the system and measure its performance on a large scale. To do this, they built a custom cryo-optical metrology setup.
Using this technique, they demonstrated an entire chip with over 4,000 qubits that could be tuned to the same frequency while maintaining their spin and optical properties. They also built a digital twin simulation that connects the experiment with digitized modeling, which helps them understand the root causes of the observed phenomenon and determine how to efficiently implement the architecture.
In the future, the researchers could boost the performance of their system by refining the materials they used to make qubits or developing more precise control processes. They could also apply this architecture to other solid-state quantum systems.
This work was supported by the MITRE Corporation Quantum Moonshot Program, the U.S. National Science Foundation, the U.S. Army Research Office, the Center for Quantum Networks, and the European Union’s Horizon 2020 Research and Innovation Program.
Quantum computers have the potential to solve complex problems in human health, drug discovery, and artificial intelligence millions of times faster than some of the world’s fastest supercomputers. A network of quantum computers could advance these discoveries even faster. But before that can happen, the computer industry will need a reliable way to string together billions of qubits – or quantum bits – with atomic precision.
Connecting qubits, however, has been challenging for the research community. Some methods form qubits by placing an entire silicon wafer in a rapid annealing oven at very high temperatures. With these methods, qubits randomly form from defects (also known as color centers or quantum emitters) in silicon’s crystal lattice. And without knowing exactly where qubits are located in a material, a quantum computer of connected qubits will be difficult to realize.
But now, getting qubits to connect may soon be possible. A research team led by Lawrence Berkeley National Laboratory (Berkeley Lab) says that they are the first to use a femtosecond laser to create and “annihilate” qubits on demand, and with precision, by doping silicon with hydrogen.
The advance could enable quantum computers that use programmable optical qubits or “spin-photon qubits” to connect quantum nodes across a remote network. It could also advance a quantum internet that is not only more secure but could also transmit more data than current optical-fiber information technologies.
“To make a scalable quantum architecture or network, we need qubits that can reliably form on-demand, at desired locations, so that we know where the qubit is located in a material. And that’s why our approach is critical,” said Kaushalya Jhuria, a postdoctoral scholar in Berkeley Lab’s Accelerator Technology & Applied Physics (ATAP) Division. She is the first author on a new study that describes the technique in the journal Nature Communications . “Because once we know where a specific qubit is sitting, we can determine how to connect this qubit with other components in the system and make a quantum network.”
“This could carve out a potential new pathway for industry to overcome challenges in qubit fabrication and quality control,” said principal investigator Thomas Schenkel , head of the Fusion Science & Ion Beam Technology Program in Berkeley Lab’s ATAP Division. His group will host the first cohort of students from the University of Hawaii in June as part of a DOE Fusion Energy Sciences-funded RENEW project on workforce development where students will be immersed in color center/qubit science and technology.
The new method uses a gas environment to form programmable defects called “color centers” in silicon. These color centers are candidates for special telecommunications qubits, or “spin photon qubits.” The method also uses an ultrafast femtosecond laser to anneal silicon with pinpoint precision, precisely where those qubits should form. A femtosecond laser delivers very short pulses of energy within a quadrillionth of a second to a focused target the size of a speck of dust.
Spin photon qubits emit photons that can carry information encoded in electron spin across long distances – ideal properties to support a secure quantum network. Qubits are the smallest components of a quantum information system; each can encode data as the state 1, the state 0, or a superposition of both at once.
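For readers unfamiliar with the term, a single qubit's state is conventionally written as a weighted combination of the two basis states, with the weights' squared magnitudes giving measurement probabilities. This tiny sketch shows the textbook definition; it is generic and not specific to the spin photon qubits above:

```python
import math

# |psi> = a|0> + b|1>, with |a|^2 + |b|^2 = 1 (normalization).
a, b = 1 / math.sqrt(2), 1 / math.sqrt(2)   # an equal superposition
assert abs(a**2 + b**2 - 1) < 1e-12

# Measuring collapses the state: P(0) = |a|^2, P(1) = |b|^2.
print(f"P(0) = {a**2:.2f}, P(1) = {b**2:.2f}")  # 0.50 each
```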
With help from Boubacar Kanté, a faculty scientist in Berkeley Lab’s Materials Sciences Division and professor of electrical engineering and computer sciences (EECS) at UC Berkeley, the team used a near-infrared detector to characterize the resulting color centers by probing their optical (photoluminescence) signals.
What they uncovered surprised them: a quantum emitter called the Ci center. Owing to its simple structure, stability at room temperature, and promising spin properties, the Ci center is an interesting spin photon qubit candidate that emits photons in the telecom band. “We knew from the literature that Ci can be formed in silicon, but we didn’t expect to actually make this new spin photon qubit candidate with our approach,” Jhuria said.
An artistic depiction of a new method to create high-quality color-centers (qubits) in silicon at specific locations using ultrafast laser pulses (femtosecond, or one quadrillionth of a second). The inset at the top-right shows an experimentally observed optical signal (photoluminescence) from the qubits, with their structures displayed at the bottom. (Credit: Kaushalya Jhuria/Berkeley Lab)
The researchers learned that processing silicon with a low femtosecond laser intensity in the presence of hydrogen helped to create the Ci color centers. Further experiments showed that increasing the laser intensity can increase the mobility of hydrogen, which passivates undesirable color centers without damaging the silicon lattice, Schenkel explained.
A theoretical analysis performed by Liang Tan, staff scientist in Berkeley Lab’s Molecular Foundry, shows that the brightness of the Ci color center is boosted by several orders of magnitude in the presence of hydrogen, confirming their observations from laboratory experiments.
“The femtosecond laser pulses can kick out hydrogen atoms or bring them back, allowing the programmable formation of desired optical qubits in precise locations,” Jhuria said.
The team plans to use the technique to integrate optical qubits in quantum devices such as reflective cavities and waveguides, and to discover new spin photon qubit candidates with properties optimized for selected applications.
“Now that we can reliably make color centers, we want to get different qubits to talk to each other – which is an embodiment of quantum entanglement – and see which ones perform the best. This is just the beginning,” said Jhuria.
“The ability to form qubits at programmable locations in a material like silicon that is available at scale is an exciting step towards practical quantum networking and computing,” said Cameron Geddes, Director of the ATAP Division.
Theoretical analysis for the study was performed at the Department of Energy’s National Energy Research Scientific Computing Center (NERSC) at Berkeley Lab with support from the NERSC QIS@Perlmutter program.
The Molecular Foundry and NERSC are DOE Office of Science user facilities at Berkeley Lab.
This work was supported by the DOE Office of Fusion Energy Sciences.
Lawrence Berkeley National Laboratory (Berkeley Lab) is committed to delivering solutions for humankind through research in clean energy, a healthy planet, and discovery science. Founded in 1931 on the belief that the biggest problems are best addressed by teams, Berkeley Lab and its scientists have been recognized with 16 Nobel Prizes. Researchers from around the world rely on the Lab’s world-class scientific facilities for their own pioneering research. Berkeley Lab is a multiprogram national laboratory managed by the University of California for the U.S. Department of Energy’s Office of Science.
DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States, and is working to address some of the most pressing challenges of our time. For more information, please visit energy.gov/science .
via InfoWorld
Jan. 25, 2024
By Simon Bisson
Back in the early 1990s, I worked in a large telecoms research lab, as part of the Advanced Local Loop group. Our problem domain was the “last mile”—getting services to peoples’ homes. One of my research areas involved thinking about what might happen when the network shift from analog to digital services was complete.
I spent a great deal of time in the lab’s library, contemplating what computing would look like in a future of universal bandwidth. One of the concepts that fascinated me was ubiquitous computing, where computers disappear into the background and software agents become our proxies, interacting with network services on our behalf. That idea inspired work at Apple, IBM, General Magic, and many other companies.
One of the pioneers of the software agent concept was MIT professor Pattie Maes. Her work crossed the boundaries between networking, programming, and artificial intelligence, and focused on two related ideas: intelligent agents and autonomous agents. These were adaptive programs that could find and extract information for users and change their behavior while doing so.
It has taken the software industry more than 30 years to catch up with that pioneering research, but with a mix of transformer-based large language models (LLMs) and adaptive orchestration pipelines, we’re finally able to start delivering on those ambitious original ideas.
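The agent pattern the article describes — a model that decides whether to call a tool or answer, with an orchestration loop feeding observations back — can be sketched in a few lines. Everything here is a stand-in: `call_llm`, the tool registry, and the `TOOL:`/`FINAL:` convention are hypothetical, not any real SDK's API:

```python
# Minimal, hypothetical agent loop; not a real framework's interface.
def call_llm(prompt: str) -> str:
    """Stub for a transformer-based LLM; returns a canned decision."""
    if prompt.startswith("Observation:"):
        return "FINAL:" + prompt[len("Observation: "):]
    if "weather" in prompt.lower():
        return "TOOL:get_weather:London"
    return "FINAL:No tool available for this request in this sketch."

def get_weather(city: str) -> str:
    return f"Sunny in {city} (stub data)"

TOOLS = {"get_weather": get_weather}

def run_agent(user_goal: str, max_steps: int = 3) -> str:
    """Orchestration loop: the model either calls a tool or answers."""
    prompt = user_goal
    for _ in range(max_steps):
        reply = call_llm(prompt)
        if reply.startswith("FINAL:"):
            return reply[len("FINAL:"):]
        _, name, arg = reply.split(":", 2)
        # Feed the tool's observation back to the model for the next step.
        prompt = f"Observation: {TOOLS[name](arg)}"
    return "step limit reached"

print(run_agent("What's the weather in London?"))
```

In a real pipeline the stub model is replaced by an LLM call and the tool registry by typed plugins, but the control flow — decide, act, observe, repeat — is the same idea Maes's autonomous agents anticipated.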
The following announcement is from Xinyi Zhou ([email protected]). Please contact them directly if you have any questions.
Hi everyone!
We at the Adaptive Computing Experiences (ACE) Lab at the University of Southern California are conducting a research study to investigate biases in Human-AI Software Development Teams. We would love to hear your thoughts and experiences!
I am recruiting individuals who meet the following criteria for the study:
If you decide to participate in this study, you will be asked to do the following activities:
During these activities, you will be asked questions about:
Participants in this study will receive a $40 reward after the final interview.
If you are interested in participating in this study, please click this link to fill out our survey. If you have questions, please contact me at [email protected].
Published on June 4th, 2024
Last updated on June 4th, 2024
QUALITY COMPUTER LABS PROMOTE STUDENT SUCCESS. Adnan Omar and Muhammed Miah. Department of Computer Information Systems, Southern University at New Orleans, 6801 Press Drive, New Orleans ...
The increasing use of personal and portable technologies such as smartphones and tablets, together with the availability of fabrication technologies, has led to recent reenvisioning of the computer laboratory as a space where digital technologies are used as part of integrated research, design, and realization activities.
The Computer Laboratory Environment Inventory (CLEI) and the Attitude Towards Computer and Computer Courses (ACCC) (Fisher, 1998). The sample consisted of 250 students taken from private ...
Based on the research that has been done, it’s concluded that: (1) description of computer laboratory facilities, students’ learning interest and students’ mathematics ...
The Computer Science and Artificial Intelligence Laboratory (CSAIL) pursues fundamental research across the entire breadth of computer science and artificial intelligence. CSAIL is committed to leading the field both in new theoretical approaches and in the creation of applications that have broad societal impact.
Research. The computing and information revolution is transforming society. Cornell Computer Science is a leader in this transformation, producing cutting-edge research in many important areas. The excellence of Cornell faculty and students, and their drive to discover and collaborate, ensure our leadership will continue to grow.
This study focuses on the computer laboratory class as a learning environment in university courses. It involved the development and validation of two instruments, the Computer Laboratory Environment Inventory (CLEI) and the Attitude towards Computing and Computing Courses Questionnaire (ACCC). The CLEI has five scales for measuring students ...
Computer-Assisted Medicine. Computer science research at the Johns Hopkins University is advancing computing technology, enabling new modes of thought, and transforming society. Our faculty conduct innovative, collaborative research aimed at solving large and complex interdisciplinary problems, drawing upon the university's renowned strengths ...
A computer laboratory is an expensive resource in terms of equipment and people, and should be used as effectively as possible. Computer laboratory classes may be organized as closed laboratories which are scheduled and staffed in the same way as other classes, or as open laboratories where students come and go as they please.
Six tools group leaders love. • AnyDesk is free software for accessing and controlling computers remotely. • Google's Apps Script automates actions across the Google application suite ...
Developed computer laboratory will benefit the students in research. 2. Undeveloped computer laboratory will not benefit the students in research "Improving the quality of computer laboratory for Senior High School students of Mount Carmel School of Maria Aurora Inc. (MCSMA): An Evaluation" 6 MOUNT CARMEL SCHOOL OF MARIA AURORA, (MCSMA) INC.
About. USC has a strong and active background in modern theoretical computer science, with research spanning a broad range of topics. Areas of particular interest include the theory of algorithms and optimization, graph theory, scalable algorithms, theory of machine learning, computational geometry, complex analysis, computational complexity, algorithmic number theory and cryptography.
Computer Science and Artificial Intelligence Laboratory (CSAIL) The largest interdepartmental laboratory at MIT that focuses on developing fundamental new technologies, conducting basic research that furthers the field of computing, and inspiring and educating future generations of scientists and technologists. View CSAIL.
The Computer Architecture (comparch) Lab conducts research on all aspects of future microprocessor technology including performance, power, multi-threading, chip-multiprocessing, security, programmability, reliability, interaction with compilers and software, and the impact of future technologies. Data Systems and Analytics Group
The Computer Architecture lab is dedicated to the engineering of machines that learn faster, compute more efficiently, operate safer, program easier, and integrate more responsibly. It aims to be an inviting and interdisciplinary space to rethink what a computer system can be. ... The group's research interests are in computer systems and ...
Welcome to the Computer Systems Lab (CSL) at Yale University.. CSL is an interdisciplinary laboratory with faculty from both Electrical Engineering and Computer Science that have a shared research interest in computer systems. CSL research encompasses both theory and practice in a wide range of domains including architecture, biologically-inspired computing, certified systems software ...
Massachusetts Institute of Technology. Computer Science & Artificial Intelligence Laboratory. 32 Vassar St, Cambridge MA 02139
The ELX lab conducts research in human-computer interaction, with a focus on learning technologies and technologies to support mental wellness. Research areas include wearable technologies for learning, enactment-based storytelling, Maker technologies in education and narrative-centered technological approaches. The goal of the lab is to design ...
Machina and Wild investigated the importance of ELNs when integrated with other computer tools, such as laboratory information management systems, analytical instrumentation, data management systems, and scientific data. They observed that the type of laboratory (analytical, synthesis, clinical, research) was a primary source of differences ...
Mission: The VLM lab is a research lab at the Computer Science and Engineering Department of the University of Texas at Arlington. At the VLM lab we are conducting research in the areas of computer vision, machine learning, and data mining. Areas of focus include gesture and sign language recognition, human motion analysis, detection and ...
The Prescience Lab conducts a range of experimental computer systems research with a current focus on virtualization, empathic systems, and parallel and distributed systems. This lab is headed by Peter Dinda. ... The VLSI Research Lab at NU focuses on the areas of energy efficient computing with digital and mixed signal design approaches. We ...
Research Labs. Automated Reasoning Group (Adnan Darwiche) Big Data and Genomics Lab (Eran Halperin) Biocybernetics Laboratory (Joe DiStefano) Center for Smart Health (Ramin Ramezani) Center for Vision, Cognition, Learning and Autonomy (Song-Chun Zhu) Cognitive Systems Laboratory (Judea Pearl) Computational Genetics Laboratory (Eleazar Eskin ...
The Computer Systems Research Laboratory is a research group made up of faculty members and students in the Department of Computer Science and Engineering at the University of North Texas. While we are interested in many topics within computer science, our current research focuses mainly on 3D RAM design, Processing-in-Memory and memory ...
The University of the Philippines (UP) Computer Security Group (CSG) is a research laboratory established in 2002. It is focused on the study of security mechanisms and protocols with a particular emphasis in the design and development of secure applications. As part of its affi….
Enhancing engineering education through mini project-based learning in computer integrated manufacturing laboratory: A student-centric approach. Ravikantha Prabhu ... holds a Master's degree from Srinivas Institute of Technology and a Ph.D. from VTU, Belagavi. His research focused on biodiesel with aqueous nanoparticles. His interests include ...