Abstract

In the fourth industrial revolution, digital twins, virtual copies of a system that interact with their physical counterparts bidirectionally, are promising enablers for replicating and analyzing production systems in real time as new products and processes are developed. They address the insufficient guidance methods of existing enterprise service remote assistance systems. In this paper, a digital twin-driven enterprise service remote assistance guidance system is proposed. The digital twin system is designed to carry out all-around analyses of the remote internal system, and the digital and physical spaces of the enterprise service system are reset according to the data query results. The proposed model achieves internal data mapping of the enterprise service and analyzes the internal data of the system. Building on real-time mapping and the large volume of twin data generated by virtual-real interaction, the data are visualized and stored in a database for the upper layers. The proposed model has been simulated, and the test results show its potential benefits for enterprise control, optimization, and forecasting; it can provide essential support for realizing the twin's optimized control of entities.

1. Introduction

As the next-generation manufacturing paradigm, intelligent manufacturing enables better quality, higher productivity, lower cost, and increased manufacturing flexibility [1]. The evolution of new-generation information technologies, including the Internet of Things (IoT), cloud computing, artificial intelligence, machine learning, big data analytics, and cyber-physical systems, contributes significantly to more efficient, competitive, and intelligent manufacturing [2]. Through information technology (IT), smart manufacturing pursues a high degree of intelligence, combining ubiquitous sensing with advanced computing and data analytics at low cost. The convergence of diverse advanced, connected devices and machines is inevitable for smart manufacturing. In addition, the rising ubiquity of IoT provides prominent opportunities to develop powerful industrial systems and applications [3]. Consequently, these connected sensors, machines, and devices generate a large volume of heterogeneous manufacturing data. These data need to be filtered, processed, and stored to produce information, which is the basis for smart manufacturing [4]. However, the exponential growth of data goes beyond the general processing capabilities of users [5].

Cloud computing can deal with high volumes of data in many cases. It is an Internet-based computing platform where shared resources are accessed and used efficiently on demand [6]. In the cloud computing model, users can access high-quality services at low cost. Due to its potential and practical benefits to society and the economy, the cloud computing paradigm has attracted growing attention from academia and industry. Combined with manufacturing, cloud computing has given rise to a new cloud-based manufacturing model in which all manufacturing resources and capabilities are digitized and encapsulated as services to be managed, allocated, and used on demand through the cloud [7].

Digitalization, especially the digital twin, a precise virtual copy of a system or machine, is modernizing industry. Many companies and enterprises already use digital twins to solve problems and enhance efficiency [7]. With the development of IT, especially the IoT, big data analytics, cloud computing, and machine learning, the digitalization process is accelerating significantly. Through the convergence of the physical and virtual worlds, digitalization is becoming one of the key driving forces for innovation in all sectors. Owing to the shortcomings of early digitalization tools, initially only the most indispensable information was digitalized for storage, processing, and transfer. With the advent of the Internet and advanced control technologies, business digitalization provided new revenue and value-producing opportunities for enterprises. The emergence of cloud computing, big data, and AI now makes it possible to gradually converge the physical and virtual worlds toward the digitalization of the industrial ecology.

Digital twins can effectively optimize the control capabilities of enterprises, so they have wide application across society. A digital twin can simulate a selected physical object to obtain real data and transform it into actionable information, maintain a high degree of consistency with the physical object, control overall efficiency in real time, and redirect information flows in time. Gartner considers digital twin technology a potentially transformative force in social development and lists it among its strategic goals for future development [8]. Digital twin technology can change the future aerospace landscape, enhance a country's overall national defence capabilities, and promote the development of economic globalization [9]. The term DT was applied to Industry 4.0 by Siemens in 2016 [10]. Tao et al. [11] suggested the concept of the DT shop floor in January 2017, explaining its features, composition, and operation mechanism and providing theoretical support for the use of DT in manufacturing. Later, Tao et al. [12] proposed the five-dimension DT model to encourage further applications of DT in more fields.

Using DT technology for innovation can effectively improve the overall innovativeness and creativity of a system. The actual innovation factors in physical space, such as the system's main body and elements, are fully mapped to the information space through digital expression. In this mapping process, the data of these innovation factors are used to construct a comprehensive virtual simulation model integrating all elements: the "twin" of the innovation system in the information space. The parameters of this virtual model can be adjusted to simulate, monitor, diagnose, predict, and control the innovation system in physical space through the "virtual controls the real" method. The innovation system is optimized and iterated to improve its integrity, reduce risk, and finally achieve an accurate expression of the innovation system through the interaction of data and information. After a large amount of data is collected, efficient analysis and integration are carried out, and valuable information is reorganized and fitted to form a fully reproduced information model. This processing method is an innovative combination of the DT system and remote enterprise services, which can effectively improve the enterprise's overall efficiency and greatly enrich research on the paradigms by which DT technology participates in innovation activities. Twin technology provides theoretical guidance for carrying out innovative activities and a reference for the academic community to continue studying this field. In this study, a digital twin-driven enterprise service remote assistance guidance system is presented. The digital twin system is used to carry out further all-around analyses of the remote internal system. The enterprise service system's digital space and physical space are reorganized according to the data query results, and a data communication network is established. The system also provides big data support for enterprise control, optimization, and forecasting, and essential support for realizing the twin's optimized control of entities.

The rest of the paper is organized as follows. Section 2 discusses the real-time mapping of the twin model. Section 3 illustrates the proposed system environment, network establishment, and web application. Section 4 presents the performance evaluation, and Section 5 concludes the work.

2. Twin Model Real-Time Mapping

DT represents the development of digitalization. It is progressively applied in more and more areas, such as smart manufacturing, building management, smart cities, healthcare, oil and gas, and enterprise control. In remote enterprise control, for example, the DT analyzes the real-time dynamics and behaviors inside the physical space and efficiently maps them to achieve data matching. While collecting data and information from the physical space, many signals are processed, and the data are gradually integrated using big data capabilities to achieve efficient integration of the physical and digital spaces. Building on real-time mapping and the large volume of twin data generated by virtual-real interaction, the system analyzes and integrates data such as equipment status and system operation status to provide the upper layer with data visualization and database storage for the production site. In this way, it offers the supervisory layer a more transparent, multilevel perspective and supplies data for big data algorithms such as production control, optimization, and forecasting. It provides basic support for realizing the twin's optimized control of entities.

2.1. Virtual and Real Interaction Structure of Twin System

Unlike traditional production line simulation, the core of DT technology is to realize the interactive integration of the virtual and real data spaces [13]. Therefore, building on the production line model and the communication interface, the virtual-real interaction structure of the twin system is designed. A schematic diagram of the proposed virtual-real interaction structure is shown in Figure 1.

During a production run, the server performs data queries and analyses based on the interactive results of the system and uploads the results to the virtual-real interaction space of the DT system. The digital space drives the real-time mapping of the twin model to the physical system based on the received real-time information. It provides management with a full range of real-time monitoring through systems such as enterprise resource planning (ERP), manufacturing execution system (MES), and product lifecycle management (PLM). Furthermore, it carries out relevant data statistics on the operation of the digital space. On the one hand, statistical data are used for status analysis, and abnormal conditions are fed back to management. On the other hand, the effective data of the digital space are stored in the database to support data analysis. In this continuous process of interaction and iteration, the twin space continuously improves the production process, thereby completing the virtual-real interaction process of the digital twin production system based on data interaction [14]. The following sections discuss the realization of virtual-real interaction and the processing of simulation data, respectively.
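To make this interaction cycle concrete, the following minimal Python sketch models one iteration of the loop described above: query the physical system, update the twin's mapping, accumulate statistics, and feed abnormal conditions back. All names (query_physical_state, update_twin_model, the signal fields, and thresholds) are hypothetical illustrations, not the paper's implementation.

```python
import time
from dataclasses import dataclass, field

@dataclass
class TwinState:
    """Hypothetical snapshot of the digital space (illustrative only)."""
    signals: dict = field(default_factory=dict)
    statistics: dict = field(default_factory=dict)

def query_physical_state() -> dict:
    # Placeholder for a data query against the physical system,
    # e.g., reading sensor values via an industrial gateway.
    return {"spindle_rpm": 1200, "temperature_c": 41.5}

def update_twin_model(twin: TwinState, signals: dict) -> None:
    # Drive the real-time mapping: the digital space mirrors
    # the latest physical signals.
    twin.signals.update(signals)

def update_statistics(twin: TwinState) -> None:
    # Accumulate simple operating statistics for the upper layer.
    temp = twin.signals.get("temperature_c", 0.0)
    twin.statistics["max_temperature_c"] = max(
        twin.statistics.get("max_temperature_c", temp), temp)

def feed_back(twin: TwinState) -> None:
    # Report abnormal conditions to management; effective data would
    # be stored in a database (stubbed here as console output).
    if twin.signals.get("temperature_c", 0.0) > 80.0:
        print("ALERT: temperature threshold exceeded")

twin = TwinState()
for _ in range(3):                 # three iterations of the cycle
    update_twin_model(twin, query_physical_state())
    update_statistics(twin)
    feed_back(twin)
    time.sleep(0.1)                # polling interval (illustrative)
```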

2.2. Realization of Real-Time Mapping

Real-time mapping is the process of instantaneous, dynamic visualization of an environment. It is an indispensable capability of the digital space for processing information and data and is the basis for realizing the interaction between reality and virtual information. Compared with traditional production line simulation, the DT system is of great significance to the realization of intelligent manufacturing. Real-time mapping has the advantages of high efficiency, convenience, and improved performance. Furthermore, it enables real-time manipulation of the physical space from the virtual information space, which greatly improves overall efficiency. The mapping and interaction between the digital space and the physical space are divided into four parts (a data-tag sketch follows this list):

(i) Production: the production process is the core step in making a product. Because the product's life cycle spans from the processing of parts to the later packaging and delivery, the production process is very important. Therefore, the product's process data, performance data, and production data must be stored in virtual dynamic tags to drive the data.

(ii) Mode: the overall production process of the line can be improved through program settings, the mode setting of actions, running performance, and spatial position. Other elements in the production process can be engaged to control the processing equipment, such as robots on the production line. The mode helps to finish product processing and production.

(iii) Space: digital planning and management of the processing space's internal workflow and working parameters through real-time observation of actual parameter changes in the space. Timely modification of virtual space values and adequate early warning of environmental parameters are essential elements of real-time mapping systems.

(iv) Design: real-time mapping technology is used to design and analyze the production process, production plan, process schedule, and other production workflows. It can effectively query the internal process conditions and logistics status of production and improve the information processing ability of the enterprise. Centralized queries of product counts improve the analysis and management of the digital space. During production, real-time mapping can make intelligent changes to product logistics, manufacturing capacity, and process inspection and improve analysis and refining capabilities. The intelligent production system combines virtual-space value query and management to increase the speed of the production process while improving the accuracy of queried information, providing data support for enterprises.

Real-time mapping can be efficiently realized on the basis of these four parts: production, mode, space, and design.
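As an illustration of the "virtual dynamic tags" mentioned under Production, the following minimal Python sketch stores process, performance, and production data per tag. The structure and field names are assumptions for illustration, not the paper's data model.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class DynamicTag:
    """Hypothetical virtual dynamic tag attached to a product instance."""
    product_id: str
    process_data: Dict[str, Any] = field(default_factory=dict)      # e.g., machining steps
    performance_data: Dict[str, Any] = field(default_factory=dict)  # e.g., cycle times
    production_data: Dict[str, Any] = field(default_factory=dict)   # e.g., batch, station

    def update(self, category: str, key: str, value: Any) -> None:
        # Route an incoming signal to the matching data category.
        getattr(self, f"{category}_data")[key] = value

tag = DynamicTag(product_id="P-001")
tag.update("process", "milling_step", 3)
tag.update("performance", "cycle_time_s", 42.7)
print(tag)
```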

3. System Design

The design of a DT-based system rests on multiple components, among which monitoring servers, web applications, and data models are essential; these can optimize the DT system and improve the remote monitoring capabilities of the enterprise.

3.1. System Environment

In terms of system design, Windows is usually selected for its broad applicability and strong compatibility. In addition, the Windows system provides good graphics support and universality, supports a variety of data, and can be flexibly applied in software system design.

3.2. Monitoring Server Core Framework Selection

The monitoring server is one of the most important core frameworks in the remote information service management system. It monitors the data requests of the player or mobile-phone software in the network TV client, monitors network applications, and provides timely feedback according to the monitored contents. The data integrated from the player and the phone's built-in software are combined, and the content is refined and analyzed. In the actual data transmission process, the monitored contents usually contain private user data, so there are significant requirements for network security and confidentiality. Therefore, HTTP-based requests are used when the core framework of the monitoring server is designed, protecting user data and information security. In practice, HTTP uses short-lived connections and a single request-response exchange, which helps keep the user's private information confidential. When the system monitors a user, each item of information is exchanged with different clients over HTTP, and every message passes through the server. However, owing to the characteristics of the HTTP request mode, the server does not store the information after receiving it and does not actively maintain client state, which guarantees the user's privacy to a great extent. Sockets are used to handle the communication channel in the actual design. A socket binds an Internet Protocol (IP) address and a port, controls the transmission of data through address positioning, and enables high-efficiency monitoring services. The Transmission Control Protocol (TCP) controls how the system transmits data [15]. In this paper, the three-way handshake mechanism of TCP is used to ensure the reliability of data transmission through two-way network intercommunication, as shown in Figure 2.
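As a minimal illustration of the socket mechanics described above, the following Python sketch binds a server socket to an IP address and port and echoes one client message; TCP's three-way handshake is performed by the operating system inside connect()/accept(). The host, port, and message are illustrative assumptions.

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 9090  # illustrative address and port

# Bind and listen first so the client cannot connect too early.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))          # bind IP address and port
srv.listen(1)

def serve_once() -> None:
    # The TCP three-way handshake completes inside accept()/connect();
    # application code only sees the established connection.
    conn, _addr = srv.accept()
    with conn:
        data = conn.recv(1024)             # receive monitored request data
        conn.sendall(b"ACK: " + data)      # timely feedback to the client

threading.Thread(target=serve_once, daemon=True).start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))              # client side of the handshake
    cli.sendall(b"status query")
    print(cli.recv(1024).decode())         # -> ACK: status query
srv.close()
```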

For socket connections, the SuperSocket lightweight network programming framework can be used to reduce the amount of detailed data processing within a working range and increase connection-creation speed. The TCP implementation contains three independent data modules with different divisions of labor for processing information, which maintain and manage the communication between the server and the client. In application, SuperSocket can monitor multiple ports, optimizing the monitoring server's core system and improving overall performance. The one-to-many monitoring framework of SuperSocket is embodied in its monitoring method: the SuperSocket monitoring server is composed of multiple modules corresponding to different client ports and can monitor different ports simultaneously, such as command, modification, and log ports. The object model of the SuperSocket framework includes three main modules, the application server, the session server, and the socket server, which use different commands to process and refine sessions [16]. The socket server is responsible for sending and receiving all information data. The application server is responsible for filtering and screening the received information and transmitting the effective information to the command unit for the next stage of data processing, as shown in Figure 3.
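SuperSocket itself is a .NET framework; to keep the examples in one language, the following Python asyncio sketch imitates its one-to-many idea by listening on several ports (command, modification, log) from one handler factory. The port numbers and role names are illustrative assumptions.

```python
import asyncio

# Illustrative role-to-port mapping (command, modification, log).
PORTS = {"command": 9001, "modify": 9002, "log": 9003}

def make_handler(role: str):
    async def handle(reader: asyncio.StreamReader,
                     writer: asyncio.StreamWriter) -> None:
        data = await reader.readline()     # one monitored request
        # Each port has its own module, mirroring the one-to-many
        # monitoring structure described above.
        writer.write(f"[{role}] ok: ".encode() + data)
        await writer.drain()
        writer.close()
        await writer.wait_closed()
    return handle

async def main() -> None:
    servers = [
        await asyncio.start_server(make_handler(role), "127.0.0.1", port)
        for role, port in PORTS.items()
    ]
    # Exercise each port once with a small client.
    for role, port in PORTS.items():
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        writer.write(b"ping\n")
        await writer.drain()
        print((await reader.readline()).decode().strip())
        writer.close()
        await writer.wait_closed()
    for srv in servers:
        srv.close()

asyncio.run(main())
```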

The SuperSocket framework can effectively improve the system's overall testing and analysis capabilities, so the range of program objects involved is comprehensive. The functions of the object types designed in the SuperSocket framework application program are summarized in Table 1.

SuperSocket request processing is divided into a socket server session stage, an application server session stage, and a message processing stage. First, the socket server obtains information about the different clients, establishes the corresponding session, and requests its verification. At this stage, the session can only receive and send data and cannot effectively process the information. Second, after the information is received, it is transferred to the application server for processing, and an application server session is created based on the information of the monitored object. The session is then started for data processing, and the initial data are filtered to remove illegal and unwanted information with certain characteristics. Third, the application server establishes a RequestInfo data structure based on the complete data after analysis and restoration, and classifies the filtered, analyzed, and reorganized information into multiple RequestInfo modules. Finally, based on the properties of the split RequestInfo modules, the application server completes the corresponding information processing under the execution unit module according to the different instructions, completing the entire SuperSocket request-processing procedure. The SuperSocket request-processing model is shown in Figure 4.
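The following Python sketch mirrors the three-stage pipeline just described: a raw session byte stream is filtered, parsed into a RequestInfo-like structure, and dispatched to a command handler. The class, command, and function names are illustrative assumptions, not SuperSocket's actual .NET API.

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class RequestInfo:
    """Illustrative stand-in for the RequestInfo data structure."""
    key: str        # command name, e.g. "STATUS"
    body: str       # remaining payload

def filter_stage(raw: bytes) -> str:
    # Application-server session stage: drop illegal characters
    # before the data are analyzed and restored.
    return raw.decode("utf-8", errors="ignore").strip()

def parse_stage(text: str) -> RequestInfo:
    # Build the RequestInfo structure: first token is the command key.
    key, _, body = text.partition(" ")
    return RequestInfo(key=key.upper(), body=body)

# Execution-unit module: each command key maps to a handler.
COMMANDS: Dict[str, Callable[[RequestInfo], str]] = {
    "STATUS": lambda r: f"status of {r.body}: running",
    "LOG":    lambda r: f"logged: {r.body}",
}

def process(raw: bytes) -> str:
    req = parse_stage(filter_stage(raw))
    handler = COMMANDS.get(req.key)
    return handler(req) if handler else f"unknown command {req.key}"

print(process(b"STATUS line-3\n"))   # -> status of line-3: running
print(process(b"LOG door opened"))   # -> logged: door opened
```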

3.3. Web Application Core Framework Selection

Web applications have the advantages of simplicity, convenience, and efficiency in actual use. In processing information, they can provide users with different levels of authority and functions through efficient data query and collection capabilities, improving usability. Web applications can provide personalized services according to users' usage attributes and actively search for corresponding data and information according to their different needs. They provide diversified services, process interactive information efficiently, and give users timely feedback. The web application core framework can follow two different models, either of which can shape the development of the system: the current application scope is mainly divided into the client/server (C/S) model and the browser/server (B/S) model [17]. The client/server model uses a division of labor to reorganize and distribute data and information, which can effectively improve data confidentiality, ensure information integrity, and offer flexible processing capabilities. The browser/server model can deal with data problems comprehensively: it can control data verification, debugging, and deployment and can perform multiple analyses of data information to achieve accuracy. In this study, ASP.NET, which follows the browser/server model, is employed as the core framework of the web application. ASP.NET can consolidate the compilation of complex program code, effectively separate the business data and page information within the system, and is well suited to large-scale web applications and development. In current software system design, ASP.NET is mainly used for hierarchical structural design to strengthen the system's data collection and processing capabilities. Based on this hierarchical structure, the user presentation layer, the business logic layer, and the data access layer process pages, business, and data in batches, respectively. Through the cooperation of the upper and lower layers, the core data capabilities of the system are refined. The functions of these layers are as follows (a minimal sketch follows the figure reference below):

(i) The user presentation layer: the highest layer of the ASP.NET hierarchical design. It displays content and realizes interaction with users by collecting all the components of the website page. After receiving the dynamic information sent by the business logic layer, this layer performs data proofreading and simple processing and then returns the result to the business logic layer.

(ii) The business logic layer: this layer performs the logical architecture analysis that forms the core of the web application. It can actively call on the user presentation layer to receive and send information, perform secondary processing, and upload the processed information to the operating data space and the data access layer.

(iii) The data access layer: the lowest layer of the ASP.NET hierarchical design, which cooperates with the business logic layer for data search and access. The stored information can be searched, added, deleted, and modified according to the processing requirements uploaded by the business logic layer. Finally, after the information processing of all layers is completed, the results are uploaded to the master model module.
The framework of these layers is shown in Figure 5.
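To keep the examples in one language, the following Python sketch imitates the three-layer division (presentation, business logic, data access) described above; a real implementation in this system would use ASP.NET. All class and method names are illustrative assumptions.

```python
from typing import Dict, List

class DataAccessLayer:
    """Lowest layer: search, add, delete, and modify stored records."""
    def __init__(self) -> None:
        self._store: Dict[str, dict] = {}

    def add(self, key: str, record: dict) -> None:
        self._store[key] = record

    def search(self, key: str) -> dict:
        return self._store.get(key, {})

class BusinessLogicLayer:
    """Core layer: secondary processing between presentation and data."""
    def __init__(self, dal: DataAccessLayer) -> None:
        self._dal = dal

    def register_device(self, device_id: str, status: str) -> None:
        # Secondary processing: normalize before persisting.
        self._dal.add(device_id, {"status": status.lower()})

    def device_report(self, device_id: str) -> str:
        rec = self._dal.search(device_id)
        return f"{device_id}: {rec.get('status', 'unknown')}"

class PresentationLayer:
    """Highest layer: formats results for display to the user."""
    def __init__(self, bll: BusinessLogicLayer) -> None:
        self._bll = bll

    def show(self, device_ids: List[str]) -> None:
        for dev in device_ids:
            print("REPORT |", self._bll.device_report(dev))

# Wire the layers bottom-up and exercise the stack once.
dal = DataAccessLayer()
bll = BusinessLogicLayer(dal)
ui = PresentationLayer(bll)
bll.register_device("robot-1", "RUNNING")
ui.show(["robot-1", "robot-2"])
```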

3.4. Digital Spatial Data Processing

The mapping of the twin model to the entity is the basic application of the DT, providing visual management for managers. Its more critical function is to conduct deeper statistics and analysis of key production and operation data through the digital space. The key statistics of the digital space are given in Table 2. The data statistics function mainly serves the upper management system and managers: it provides statistical results on production line operation and supplies management with visual management and decision-making data support, which facilitates the adjustment of production plans and control optimization. The underlying statistical data in the digital space originate from the various equipment components and virtual service components. Each component can be built with statistical attributes, as shown in Figure 6, and a series of statistical variables is defined in the attributes according to requirements. The statistical algorithm for each variable is constructed in a script program, and the variables are continuously calculated and counted through the analysis of real-time signals in cooperation with the system's running time.
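The following Python sketch illustrates the idea of a component carrying statistical attributes whose variables are updated by a small script as real-time signals arrive; the attribute names and the utilization formula are assumptions for illustration.

```python
import time

class EquipmentComponent:
    """Hypothetical component with built-in statistical attributes."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.start = time.monotonic()
        self.busy_seconds = 0.0     # statistical variable
        self.part_count = 0         # statistical variable

    def on_signal(self, busy_delta_s: float, parts_done: int) -> None:
        # Script-like statistical algorithm driven by real-time signals.
        self.busy_seconds += busy_delta_s
        self.part_count += parts_done

    def utilization(self) -> float:
        # Busy time over total running time (illustrative formula).
        elapsed = time.monotonic() - self.start
        return self.busy_seconds / elapsed if elapsed > 0 else 0.0

comp = EquipmentComponent("press-7")
for _ in range(5):                       # five simulated signal ticks
    time.sleep(0.05)
    comp.on_signal(busy_delta_s=0.03, parts_done=1)
print(f"{comp.name}: parts={comp.part_count}, "
      f"utilization={comp.utilization():.0%}")
```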

4. Performance Evaluation

4.1. Response Time

The performance of the cloud database service system was mainly evaluated in terms of response time and the number of concurrent users. The performance of the MySQL database instance was tested and analyzed through task processing time, transactions per second, the number of queries, CPU computing performance, and disk I/O performance, with analysis and testing performed at different levels. With the help of a load test and a benchmark stress test, the performance of the system can be assessed effectively. Using a Siege simulation experiment, it is possible to simulate N users repeatedly visiting a certain URL R times and return the test results in JavaScript Object Notation (JSON) format. The important parameter is the elapsed time, which represents the test duration. The command format of the Siege stress test is "siege -r number -c number", where the number after -c indicates the number of concurrent users and the number after -r indicates the number of repetitions. Siege is an open-source regression test and benchmark utility: it can stress test a single URL with a user-defined number of simulated users, or read many URLs into memory and stress them simultaneously. In this study, Siege was used to simulate 10,000 to 20,000 (1-2 W, where 1 W denotes 10,000) users cyclically accessing the system at the same time to obtain the system response time curve. The response time of the system performance test is shown in Figure 7. It can be seen that the system performance remains good when the number of concurrent users is within 20,000.
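Siege itself is a command-line tool; for a self-contained illustration in Python, the following sketch reproduces the same idea, with C concurrent workers each issuing R requests to one URL and reporting elapsed time and the slowest response. The target URL and the C and R values are illustrative assumptions.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://127.0.0.1:8080/health"   # illustrative target URL
CONCURRENCY, REPETITIONS = 25, 4       # like: siege -c 25 -r 4

def worker(_: int) -> float:
    worst = 0.0
    for _ in range(REPETITIONS):
        t0 = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=5) as resp:
            resp.read()                # consume the response body
        worst = max(worst, time.perf_counter() - t0)
    return worst

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    worst_times = list(pool.map(worker, range(CONCURRENCY)))
elapsed = time.perf_counter() - start

print(f"elapsed time:     {elapsed:.2f} s")
print(f"transactions:     {CONCURRENCY * REPETITIONS}")
print(f"slowest response: {max(worst_times) * 1000:.1f} ms")
```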

4.2. Transactions per Second

This test uses the number of transactions executed per second (TPS) as the indicator to measure the performance of the MySQL cloud database instance. The database instance is configured with 2 CPU cores, 4 GB of memory, and 100 GB of storage. By executing the same instructions in the local database and the cloud database under the same conditions, the TPS comparison results shown in Figure 8 were obtained.
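As an illustration of how such a TPS figure can be obtained, the following Python sketch commits N small transactions against a MySQL instance and divides by the elapsed time. It assumes the mysql-connector-python package; the connection parameters and table name are illustrative.

```python
import time
import mysql.connector  # pip install mysql-connector-python

# Illustrative connection parameters; replace with the instance under test.
conn = mysql.connector.connect(
    host="127.0.0.1", user="bench", password="secret", database="benchdb")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE IF NOT EXISTS tps_probe ("
    "id INT AUTO_INCREMENT PRIMARY KEY, payload VARCHAR(64))")

N = 1000                                  # number of transactions to time
start = time.perf_counter()
for i in range(N):
    cur.execute("INSERT INTO tps_probe (payload) VALUES (%s)", (f"row-{i}",))
    conn.commit()                         # one commit = one transaction
elapsed = time.perf_counter() - start

print(f"TPS: {N / elapsed:.1f}")
cur.close()
conn.close()
```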

It is evident that the TPS of the MySQL database instance created in the cloud database service system designed in this experiment is slightly better than that of the local database instance, with the cloud database instance reaching the higher peak performance. As the number of transactions executed per second increases, the system shows consistently better performance. The experimental results confirm that the system can meet users' performance requirements for cloud database instances.

5. Conclusion

This article introduced the deployment environment and networking topology of the remote assistance guidance system for enterprise services driven by digital twins. A matching digital twin system was then proposed for the enterprise's remote assistance guidance system, and functional tests of each module were accomplished in physical space. The comprehensive performance test results showed that the digital twin system can efficiently enhance the management and control capabilities of the enterprise, optimize enterprise efficiency, and predict the enterprise's future development trends. The proposed model was simulated, and the test results show its potential benefits for enterprise control, optimization, and forecasting; it can provide basic support for realizing the twin's optimized control of entities. The service-oriented DT can also be the source of several new knowledge-driven business opportunities, and this knowledge can conveniently be provided to the enterprise's stakeholders, such as customers, suppliers, and partners.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare no conflicts of interest.