  • A system of criteria for assessing the level of large-scale construction in Far North conditions

    Construction in the Far North presents a unique challenge requiring the adaptation of technologies, materials and methods to extreme climatic conditions. The article analyzes world experience, including geotechnical research, architectural solutions, infrastructure and environmental aspects. Examples of successful projects are given, which illustrate the effectiveness of using modern technologies and methods to ensure sustainable development in the northern regions. A system of criteria for assessing the level of large-scale construction in Far North conditions is proposed.

    Keywords: construction, technology, northern territories, low resource consumption, waste minimization, minimization of environmental impact, modular construction, system of criteria

  • Time-frequency analysis of signals using EMD, ITD and VMD algorithms

    The article describes the mathematical foundations of time-frequency analysis of signals using the Empirical Mode Decomposition (EMD), Intrinsic Time-Scale Decomposition (ITD) and Variational Mode Decomposition (VMD) algorithms. Synthetic and real signals distorted by additive white Gaussian noise with different signal-to-noise ratios are considered. A comprehensive comparison of the EMD, ITD and VMD algorithms has been performed. The possibility of using these algorithms in signal denoising and spectral analysis tasks is investigated. Algorithm execution time and computational stability are estimated.

    Keywords: time-frequency analysis, denoising, decomposition, mode, Hilbert-Huang transform, Empirical Mode Decomposition, Intrinsic Time-Scale Decomposition, Variational Mode Decomposition
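
    The sifting idea behind EMD can be sketched as follows. This is a deliberately simplified illustration, not the article's implementation: standard EMD uses cubic-spline envelopes and a data-driven stopping criterion, while this sketch uses piecewise-linear envelopes and a fixed number of sifting passes.

```python
# Simplified EMD-style sifting (linear-interpolation envelopes as a
# stand-in for the cubic splines used in standard EMD).
import math

def local_extrema(x):
    """Return indices of local maxima and minima of a sequence."""
    maxima, minima = [], []
    for i in range(1, len(x) - 1):
        if x[i] > x[i-1] and x[i] > x[i+1]:
            maxima.append(i)
        elif x[i] < x[i-1] and x[i] < x[i+1]:
            minima.append(i)
    return maxima, minima

def interp_envelope(idx, x, n):
    """Piecewise-linear envelope through the points (idx, x[idx])."""
    if not idx:
        return [0.0] * n
    pts = [(0, x[idx[0]])] + [(i, x[i]) for i in idx] + [(n-1, x[idx[-1]])]
    env, k = [], 0
    for i in range(n):
        while k < len(pts) - 2 and i > pts[k+1][0]:
            k += 1
        (i0, y0), (i1, y1) = pts[k], pts[k+1]
        t = (i - i0) / (i1 - i0) if i1 != i0 else 0.0
        env.append(y0 + t * (y1 - y0))
    return env

def sift_imf(x, n_iter=5):
    """Extract one intrinsic mode function by repeated mean-envelope removal."""
    h = list(x)
    for _ in range(n_iter):
        maxima, minima = local_extrema(h)
        if len(maxima) < 2 or len(minima) < 2:
            break
        upper = interp_envelope(maxima, h, len(h))
        lower = interp_envelope(minima, h, len(h))
        h = [hi - (u + l) / 2 for hi, u, l in zip(h, upper, lower)]
    return h

# Two-tone test signal: a slow and a fast sinusoid.
signal = [math.sin(2*math.pi*5*t/256) + 0.3*math.sin(2*math.pi*40*t/256)
          for t in range(256)]
imf = sift_imf(signal)
residue = [s - i for s, i in zip(signal, imf)]
```

    By construction the extracted mode and the residue sum back to the input, which is the defining property of the decomposition.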

  • Technical sciences. Informatics, computer engineering and management

  • A fuzzy comparison method for managing the functioning of an organizational system

    Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Given that the data available for intelligent analysis in organizational systems are often fuzzy, there is the problem of comparing the corresponding units of information with each other. Several methods for such a comparison are known. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these distribution laws can be used as a criterion of the correspondence of one random variable to another. However, this approach lacks the flexibility required to solve practical problems. The proposed approach makes it possible to compare fuzzy data with fuzzy data, fuzzy data with crisp data, and crisp data with crisp data. The paper provides an example illustrating this approach. The material presented in the study was initially focused on managing organizational systems in education, but its results can be extended to other organizational systems.

    Keywords: fuzzy data, weakly structured problems, comparison criteria, hierarchy analysis method, systems analysis, fuzzy benchmarking
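
    One common way to compare fuzzy with fuzzy, fuzzy with crisp, and crisp with crisp values is the possibility measure over membership functions. The sketch below is an illustration under assumed triangular memberships, not necessarily the article's criterion; a crisp value is treated as a degenerate triangular number.

```python
# Possibility-based comparison of fuzzy/crisp values (illustrative).
def tri(a, b, c):
    """Triangular membership function with support [a, c] and peak at b.
    A degenerate triangle (a == b == c) models a crisp value."""
    def mu(x):
        if x <= a or x >= c:
            return 1.0 if a == b == c == x else 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

def possibility(mu_a, mu_b, lo, hi, steps=1000):
    """Poss(A = B) = sup_x min(mu_a(x), mu_b(x)), approximated on a grid."""
    best = 0.0
    for k in range(steps + 1):
        x = lo + (hi - lo) * k / steps
        best = max(best, min(mu_a(x), mu_b(x)))
    return best

sim_ff = possibility(tri(1, 2, 3), tri(2, 3, 4), 0, 5)  # fuzzy vs fuzzy
sim_fc = possibility(tri(1, 2, 3), tri(2, 2, 2), 0, 5)  # fuzzy vs crisp
sim_cc = possibility(tri(2, 2, 2), tri(2, 2, 2), 0, 5)  # crisp vs crisp
```

    Identical values give possibility 1, disjoint values give 0, and partially overlapping fuzzy numbers fall in between, which is what makes the same criterion usable across all three pairings.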

  • Development of an information and analytical system based on GIS technology in the field of rational management of forest resources: stages, methods, examples

    The article presents the main stages and recommendations for the development of an information and analytical system (IAS) based on geographic information systems (GIS) in the field of rational management of forest resources, providing for the processing, storage and presentation of information on forest wood resources, as well as a description of some specific examples of the implementation of its individual components and digital technologies. The following stages of IAS development are considered: the stage of collecting and structuring data on forest wood resources; the stage of justifying the type of software implementation of the IAS; the stage of equipment selection; the stage of developing a data analysis and processing unit; the stage of developing the architecture of interaction of IAS blocks; the stage of developing the IAS application interface; the stage of testing the IAS. It is proposed to implement the interaction between the client and server parts based on Asynchronous JavaScript and XML (AJAX) technology. It is recommended to use the open-source Leaflet library for visualization of geodata. To store large amounts of data on the server, it is proposed to use the SQLite database management system. The proposed approaches can find application in the creation of an IAS for the formation of management decisions in the field of rational management of forest wood resources.

    Keywords: geographic information systems, forest resources, methodology, web application, AJAX technology, SQLite, Leaflet, information processing
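
    The server-side storage step can be illustrated with the standard-library SQLite driver; the table layout below is hypothetical and only stands in for the forest-stand records such an IAS would hold.

```python
# Minimal sketch of SQLite-backed storage for forest stand records
# (hypothetical schema; a real IAS would use a file-backed database).
import sqlite3

conn = sqlite3.connect(":memory:")  # file path in a real deployment
conn.execute("""CREATE TABLE stands (
    id INTEGER PRIMARY KEY,
    quarter INTEGER,          -- forest quarter number
    species TEXT,             -- dominant species
    volume_m3 REAL,           -- standing timber volume
    lat REAL, lon REAL        -- centroid for map display (e.g. via Leaflet)
)""")
rows = [(1, 12, "pine", 2400.0, 54.18, 45.18),
        (2, 12, "birch", 1150.0, 54.19, 45.20)]
conn.executemany("INSERT INTO stands VALUES (?,?,?,?,?,?)", rows)
total = conn.execute(
    "SELECT SUM(volume_m3) FROM stands WHERE quarter = 12").fetchone()[0]
```

    A server endpoint would serialize such query results to JSON for the AJAX client, which then renders the coordinates on the map layer.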

  • A mathematical model of a fault-tolerant nonlinear conversion unit for OFDM wireless systems with frequency hopping

    With the development of low-orbit satellite Internet systems, the issue of ensuring effective operation under intentional interference comes to the fore. One solution relies on systems that combine OFDM methods with generators implementing frequency hopping (FH). Obviously, the more complex the algorithm for selecting operating frequencies, the more effective the FH system. The article proposes using the "Grasshopper" SPN cipher as the generator for selecting operating frequencies. As a result, the FH system gains high resistance to the calculation of operating frequency numbers by electronic warfare systems. However, failures and malfunctions may occur during cipher operation. To prevent their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCC). One of the transformations in "Grasshopper" is a nonlinear transformation that performs the substitution operation. Creating a new mathematical model for performing the nonlinear transformation using PMCC will ensure the operation of the SPN-cipher-based FH generator in the presence of failures and malfunctions.

    Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
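
    The frequency-selection idea can be sketched with a keyed pseudorandom function standing in for the Grasshopper cipher (which has no standard-library implementation); HMAC-SHA256 and the channel-mapping rule below are assumptions for illustration only.

```python
# Deriving hop-frequency numbers from a keyed pseudorandom transform.
# HMAC-SHA256 is a stand-in for the Grasshopper SPN cipher here.
import hmac
import hashlib

def hop_sequence(key: bytes, n_hops: int, n_channels: int):
    """Map the PRF output for each time-slot counter to a channel number."""
    seq = []
    for slot in range(n_hops):
        digest = hmac.new(key, slot.to_bytes(8, "big"), hashlib.sha256).digest()
        seq.append(int.from_bytes(digest[:4], "big") % n_channels)
    return seq

seq = hop_sequence(b"session-key", 16, 64)
```

    Both ends sharing the key reproduce the same sequence, while an observer without the key cannot predict the next channel, which is the property the article attributes to the cipher-based generator.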

  • Experience of integrated domestic systems for information modeling of infrastructure on the example of Vitro-CAD Common Data Environment and Topomatic Robur software

    With the digitalization of the construction industry and import substitution, more attention is being paid to the transition to domestic software. At each stage of construction, dedicated software products are needed, including CAD and BIM systems. The paper considers the experience of integrating Russian-made systems for information modeling of transport infrastructure and road construction. Within this work, the Vitro-CAD common data environment (CDE) and the Topomatic Robur software system were integrated. Joint work of the construction project participants in a single information space was organized. The efficiency gain for project participants was determined by their release from routine operations. The integration experience has shown that the combination of Vitro-CAD and Topomatic Robur makes it possible to manage project data efficiently, store files with version tracking, and coordinate documentation and issue comments on it.

    Keywords: common data environment, information space, information model, digital ecosystem, computer-aided design, building information modeling, automation, integration, import substitution, software complex, platform, design documentation, road construction

  • Choosing the type of mother wavelet in fractal analysis in the problem of detecting computer attacks

    The study of the statistical characteristics of network traffic makes it possible to detect its fractal features and estimate how the fractal dimension changes under cyber attacks (CA). These studies highlight the relationship between attacks and dynamic changes in the fractal dimension, which allows a better understanding of how attacks affect the structure and behavior of network traffic. Such understanding is critical for developing effective methods for monitoring and protecting networks from potential threats. These observations justify the use of fractal analysis methods, including discrete wavelet analysis, for detecting CA. In particular, it is possible to monitor the fractal dimension of telecommunication traffic in real time and track its changes. However, the choice of the most appropriate mother wavelet for multiresolution analysis remains an insufficiently studied aspect. The article evaluates the influence of the choice of mother wavelet type on the estimate of the Hurst exponent and the reliability of CA detection. The following types of mother wavelets are considered: Haar, Daubechies, Symlet, Meyer and Coiflet. The study included an experimental evaluation of the Hurst exponent on a data set that includes a SYN flood attack and normal network traffic. It was shown that the minimum spread of the Hurst exponent estimate for traffic with SYN flood attacks is achieved when using the Meyer mother wavelet with an analysis window of more than 10,000 samples and the Haar wavelet with an analysis window of less than 10,000 samples.

    Keywords: mother wavelet, computer attack, network traffic, Hurst exponent, wavelet analysis, fractal dimension
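
    The wavelet route to the Hurst exponent can be sketched with the Haar wavelet, for which the transform is a few lines of standard-library code. The scaling law Var(d_j) ~ 2^(j(2H-1)) and the log-variance regression are standard; the window sizes and other wavelet families studied in the article are not reproduced here.

```python
# Haar-wavelet estimate of the Hurst exponent from detail-coefficient
# variances (white noise should give H close to 0.5).
import math
import random

def haar_details(x):
    """One orthonormal Haar DWT step: (approximation, detail) lists."""
    a = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
    return a, d

def hurst_haar(x, levels=6):
    """Estimate H from the slope of log2 Var(d_j) against level j."""
    ys = []
    a = list(x)
    for _ in range(levels):
        a, d = haar_details(a)
        ys.append(math.log2(sum(v * v for v in d) / len(d)))
    js = list(range(1, levels + 1))
    mj, my = sum(js) / levels, sum(ys) / levels
    slope = (sum((j - mj) * (y - my) for j, y in zip(js, ys))
             / sum((j - mj) ** 2 for j in js))
    return (slope + 1) / 2  # slope = 2H - 1 for fractional Gaussian noise

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(2 ** 14)]
h = hurst_haar(noise)
```

    A persistent (attack-distorted) series would push the slope, and hence H, away from the white-noise value, which is what makes the estimate usable as a detection feature.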

  • One of the Approaches to Analyzing Source Code in Student Projects

    When evaluating student work, the analysis of written assignments, particularly the analysis of source code, becomes especially relevant. This article discusses an approach for evaluating the dynamics of feature changes in students' source code. Various source code metrics are analyzed and key metrics are identified, including quantitative metrics, program control flow complexity metrics, and the TIOBE quality indicator. A text dataset containing program source code from a website dedicated to practical programming was used to determine threshold values for each metric and to categorize them. The obtained results were used to analyze students' source code using a developed service that allows work to be evaluated on key features, the dynamics of code indicators to be observed, and a student's position within the group to be understood based on the obtained values.

    Keywords: machine learning, text data analysis, program code analysis, digital footprint, data visualization
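
    One of the control-flow complexity metrics mentioned above can be sketched as a McCabe-style count over the Python AST. The exact metric set and thresholds used in the article are not specified here; the counted node types are an assumption.

```python
# McCabe-style cyclomatic complexity over the Python AST:
# one plus the number of branch points.
import ast

BRANCH_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

code = """
def grade(score):
    if score >= 90:
        return "A"
    elif score >= 75:
        return "B"
    return "C"
"""
cc = cyclomatic_complexity(code)  # if + elif -> two branch points
```

    Tracking this value across a student's submissions gives exactly the kind of per-feature dynamics the service described above visualizes.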

  • Sufficient condition for stability of a dynamic system of stabilization of equilibrium position and programmed motions

    In the article, based on an estimate of the Euclidean norm of the deviation of the coordinates of the transient and stationary states of the dynamic system, the contraction condition of the generalized projection operator of the dynamic system with restrictions is derived. From the principle of contracting mappings, taking into account the derived contraction condition of the projection operator, estimates are obtained for the sufficient condition for the stability of the dynamic system of stabilization of the equilibrium position and programmed motions. The obtained estimates generalize previously obtained results. Ensuring the stability of the operator of a constrained dynamic system is demonstrated experimentally.

    Keywords: sufficient condition for stability, projection operator, stabilization of equilibrium position, stabilization of programmed motions, SimInTech
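
    The contraction-mapping step of the argument can be summarized as follows (the notation is assumed, since the abstract does not fix symbols):

```latex
% Contraction condition for the constrained projection operator P
% (notation assumed; the article's operator and norms may differ):
\| P(x_1) - P(x_2) \| \le q \, \| x_1 - x_2 \|, \qquad 0 \le q < 1 .
% By the Banach fixed-point theorem the iteration x_{k+1} = P(x_k)
% then converges to a unique fixed point x^{*}, which yields the
% sufficient stability condition for the stabilized equilibrium
% position and programmed motions.
```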

  • Evaluation of the complexity of algorithms for building a dominator tree in the context of implementing data flow analysis algorithms in the frontend of a compiler for the Solidity programming language

    This article discusses two of the most popular algorithms for constructing dominator trees in the context of static code analysis for the Solidity programming language. Both algorithms, the Cooper-Harvey-Kennedy iterative algorithm and the Lengauer-Tarjan algorithm, are considered effective and widely used in practice. The article compares these algorithms, evaluates their complexity, and selects the most preferable option in the context of Solidity. Criteria such as execution time and memory usage were used for comparison. The Cooper-Harvey-Kennedy iterative algorithm showed higher performance on small projects, while the Lengauer-Tarjan algorithm performed better when analyzing larger projects. Overall, however, the Cooper-Harvey-Kennedy iterative algorithm was found preferable in the context of Solidity, as it showed higher efficiency and accuracy when analyzing smart contracts in this language. This article may be useful for developers and researchers involved in static code analysis for Solidity, who can use the results and conclusions of this study in their work.

    Keywords: dominator tree, Solidity, algorithm comparison
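
    The Cooper-Harvey-Kennedy iterative scheme compared in the article can be sketched directly; the control-flow graph below is a minimal diamond example, not one derived from Solidity code.

```python
# Cooper-Harvey-Kennedy iterative dominator computation on a small CFG.

def dominators(succ, entry):
    """Return the immediate-dominator map via the iterative scheme."""
    # Depth-first postorder, reversed to get reverse postorder (RPO).
    order, seen = [], set()
    def dfs(n):
        seen.add(n)
        for s in succ.get(n, []):
            if s not in seen:
                dfs(s)
        order.append(n)
    dfs(entry)
    rpo = list(reversed(order))
    rpo_num = {n: i for i, n in enumerate(rpo)}
    preds = {n: [] for n in rpo}
    for n in rpo:
        for s in succ.get(n, []):
            preds[s].append(n)

    idom = {entry: entry}
    def intersect(a, b):
        # Walk both nodes up the partial dominator tree until they meet.
        while a != b:
            while rpo_num[a] > rpo_num[b]:
                a = idom[a]
            while rpo_num[b] > rpo_num[a]:
                b = idom[b]
        return a

    changed = True
    while changed:
        changed = False
        for n in rpo[1:]:
            ps = [p for p in preds[n] if p in idom]
            new = ps[0]
            for p in ps[1:]:
                new = intersect(new, p)
            if idom.get(n) != new:
                idom[n] = new
                changed = True
    return idom

# Diamond CFG: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
cfg = {0: [1, 2], 1: [3], 2: [3]}
idoms = dominators(cfg, 0)
```

    The join node 3 is reached via both branches, so its immediate dominator is the fork node 0, which the intersection step finds in one pass.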

  • Dynamics of electromagnetic ferro-powder coupling unit

    The paper proposes and specifies an engineering mathematical model of a drive based on a block of ferro-powder couplings. Within the framework of preliminary design, its dynamic properties are studied: transient characteristics are obtained that make it possible to assess the drive's performance quality and to outline measures for its improvement.

    Keywords: uninterruptible power supply system, automatic regulation, relay voltage regulator, relay system modeling

  • Three-component flow of requests in closed queuing systems with unlimited buffer capacity and a waiting time limit

    This article explores the probabilistic characteristics of closed queuing systems, with a particular focus on the differences between "patient" and "impatient" demands. These categories of requests play a crucial role in understanding the dynamics of service, as patient demands wait in line, while impatient ones may be rejected if their waiting time exceeds a certain threshold. The uniqueness of this work lies in the analysis of a system with a three-component structure of incoming flow, which allows for a more detailed examination of the behavior of requests and the influence of various factors on service efficiency. The article derives key analytical expressions for determining probabilistic characteristics such as average queue length, rejection probability, and other critical metrics. These expressions enable not only the assessment of the current state of the system but also the prediction of its behavior under various load scenarios. The results of this research may be useful for both theoretical exploration of queuing systems and practical application in fields such as telecommunications, transportation, and service industries. The findings will assist specialists in developing more effective strategies for managing request flows, thereby improving service quality and reducing costs.

    Keywords: waiting, queue, service, Markov process, queuing system with constraints, flow of requests, simulation modeling, mathematical model
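
    The patient/impatient distinction can be illustrated with a small simulation. This sketch uses a simplified open single-server queue with deterministic patience, not the article's closed three-component model, so the numbers are purely illustrative.

```python
# Single-server queue with impatient requests: an arriving request joins
# only if its queueing delay would not exceed its patience; otherwise it
# is rejected (reneges immediately).
import random

def simulate(lam, mu, patience, n=50000, seed=42):
    random.seed(seed)
    wait = 0.0          # virtual waiting time seen by the arriving request
    rejected = 0
    for _ in range(n):
        inter = random.expovariate(lam)    # time since previous arrival
        wait = max(0.0, wait - inter)      # server drains the backlog
        if wait > patience:                # impatient request leaves
            rejected += 1
        else:                              # patient enough: joins, is served
            wait += random.expovariate(mu)
    return rejected / n

p_short = simulate(lam=0.9, mu=1.0, patience=1.0)
p_long = simulate(lam=0.9, mu=1.0, patience=5.0)
```

    Raising the patience threshold lowers the rejection probability, the qualitative effect the analytical expressions in the article quantify exactly.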

  • A case-based reasoning method for decision making in oil spill response at oilfields

    Oil spills require timely measures to eliminate their causes and neutralize their consequences. Case-based reasoning is promising for developing specific technological solutions to eliminate oil spills. It is important to structure the description of possible situations and to form a representation of solutions. This paper presents the results of these tasks. A structure is proposed for representing oil product spill situations based on a situation tree, a description of the algorithm for situational decision-making using this structure is given, and parameters for describing oil product spill situations and presenting solutions are proposed. The situation tree makes it possible to form a representation of situations based on the analysis of various source information. This approach makes it possible to quickly refine the parameters and select similar situations from the knowledge base, whose solutions can be used in the current undesirable situation.

    Keywords: case-based reasoning, decision making, oil spill, oil spill response, decision support, situation tree

  • Development of an algorithm and software implementation of a module for correlation and regression analysis and visualization of user activity

    The purpose of the article is the software implementation of a module for analyzing site user activity based on a click heat map, compatible with domestic web services, combining the functionality of correlation and regression analysis with visualization in the form of dashboards before and after changes to site elements. All functionality runs directly in the web analytics service. Based on the data obtained for the analyzed site element, a decision is made to adjust the design and/or content to increase the click-through rate. Thus, the proposed solution expands the functionality of the web analytics service and reduces labor costs. The software module has been successfully tested: after the analysis and the necessary adjustments to the site, the click-through rate increased.

    Keywords: user activity, correlation and regression analysis, dashboard, program module, trend line, coefficient of determination
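
    The correlation-and-regression step can be sketched with a least-squares trend line and the coefficient of determination named in the keywords; the daily click counts below are hypothetical.

```python
# Least-squares trend line and R^2 for click counts (stdlib only).
from statistics import mean

def linear_fit(xs, ys):
    """Return slope, intercept and R^2 of the least-squares line."""
    mx, my = mean(xs), mean(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return slope, intercept, 1 - ss_res / ss_tot

days = [1, 2, 3, 4, 5]
clicks = [120, 135, 150, 160, 178]  # hypothetical daily clicks on an element
slope, intercept, r2 = linear_fit(days, clicks)
```

    A positive slope with R^2 near 1 indicates a stable upward trend after a change to the element, which is the signal the dashboard comparison is meant to surface.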

  • Modeling the Random Forest Machine Learning Algorithm Using the Mathematical Apparatus of Petri Net Theory

    The article considers the possibility of modeling the random forest machine learning algorithm using the mathematical apparatus of Petri net theory. The proposed approach is based on the use of three types of Petri net extensions: classical, colored nets, and nested nets. For this purpose, the paper considers the general structure of decision trees and the rules for constructing models based on a bipartite directed graph with a subsequent transition to the random forest machine learning algorithm. The article provides examples of modeling this algorithm using Petri nets with the formation of a tree of reachable markings, which corresponds to the operation of both decision trees and a random forest.

    Keywords: Petri net, decision tree, random forest, machine learning, Petri net theory, bipartite directed graph, intelligent systems, evolutionary algorithms, decision support systems, mathematical modeling, graph theory, simulation modeling
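
    The reachable-marking construction can be sketched on a toy net whose transitions mirror the branches of a decision node; the places and transitions below are illustrative and do not reproduce the article's colored or nested nets.

```python
# Tree of reachable markings for a tiny Petri net whose transitions
# mirror the branch tests of a decision node.
from collections import deque

def reachable_markings(transitions, m0):
    """BFS over markings; each transition is a (consume, produce) dict pair."""
    seen = {m0}
    tree = {m0: []}
    queue = deque([m0])
    while queue:
        m = queue.popleft()
        marking = dict(m)
        for consume, produce in transitions:
            # A transition is enabled if every input place holds enough tokens.
            if all(marking.get(p, 0) >= k for p, k in consume.items()):
                nxt = dict(marking)
                for p, k in consume.items():
                    nxt[p] -= k
                for p, k in produce.items():
                    nxt[p] = nxt.get(p, 0) + k
                frozen = tuple(sorted((p, k) for p, k in nxt.items() if k))
                tree[m].append(frozen)
                if frozen not in seen:
                    seen.add(frozen)
                    tree.setdefault(frozen, [])
                    queue.append(frozen)
    return tree

# Root place feeds two branch transitions, like a binary decision node.
ts = [({"root": 1}, {"left": 1}),    # test outcome: go left
      ({"root": 1}, {"right": 1}),   # test outcome: go right
      ({"left": 1}, {"leaf_a": 1})]  # left branch reaches leaf A
m0 = (("root", 1),)
tree = reachable_markings(ts, m0)
```

    Each path from the initial marking to a dead marking corresponds to one root-to-leaf classification path, which is the correspondence the article exploits when modeling decision trees and their ensembles.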

  • Supporting decision-making in emergency risk conditions based on the analysis of unstructured data

    There is often a need to analyze unstructured data when assessing the risk of emergency situations. Traditional analysis methods may not take into account the ambiguity of information, which makes them insufficiently effective for risk assessment. The article proposes a modified analytic hierarchy process that uses fuzzy logic, which makes it possible to account more effectively for uncertainties and subjective assessments when analyzing emergency risks. In addition, such methods make it possible to consider not only quantitative indicators but also qualitative ones. This, in turn, can lead to more informed decisions in the field of risk management and increased preparedness for various situations. The integration of technologies for working with unstructured data into the process of assessing emergency risks not only increases forecasting accuracy but also makes it possible to adapt management strategies to changing conditions.

    Keywords: artificial intelligent systems, unstructured data, risk assessment, classical analytic hierarchy process, modified analytic hierarchy process, fuzzy logical inference system
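
    The classical step the modified method builds on can be sketched as priority computation from a pairwise comparison matrix; the geometric-mean approximation and the sample judgments below are assumptions for illustration.

```python
# Priorities from a pairwise comparison matrix via the geometric-mean
# approximation of the principal eigenvector (classical AHP step).
import math

def ahp_priorities(matrix):
    """Geometric mean of each row, normalized to sum to one."""
    gm = [math.prod(row) ** (1 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Pairwise judgments for three risk factors (Saaty 1-9 scale, hypothetical).
pairwise = [[1,   3,   5],
            [1/3, 1,   3],
            [1/5, 1/3, 1]]
weights = ahp_priorities(pairwise)
```

    The fuzzy modification described above replaces these crisp judgments with fuzzy numbers, but the normalization-and-ranking structure of the computation stays the same.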

  • Development of an automated decision support system for technological preparation of additive manufacturing

    When selecting technology options for additive manufacturing, it is important to be guided by the list of basic requirements for the manufactured parts, the features of the powder material, and the range of equipment specifications. To develop the most effective solution, pilot testing of the technology is required to determine the preferred process modes that ensure the required quality, cost, and manufacturing time. Accordingly, for such industries it is important to develop a model that describes the stages of additive manufacturing. This paper is therefore devoted to the optimization, planning, and management of additive manufacturing based on multiple criteria for selecting the most effective technology with optimal loading of production facilities. The proposed model introduces indicators that characterize adaptation and make it possible to argue for the practical benefits of using 3D printing technologies. Comparing process routes by these indicators allows a specific manufacturing route to be chosen. Once the manufacturing option is selected, it is possible to calculate the most efficient placement of products on platforms and to develop a suitable sequence of the main manufacturing operations in order to minimize costs and time expenditures. It was found that the proposed model reduces the manufacturing time of a batch of products by approximately 2.5%. Thus, the model can be useful in additive manufacturing for reducing machine downtime and speeding up product release.

    Keywords: production organization, productivity assessment, equipment reconfiguration, machine loading, automation, multi-product production

  • Image compression method based on the analysis of the weights of the detailing coefficients of the wavelet transform

    Many modern information processing and control systems for various fields are based on software and hardware for image processing and analysis. At the same time, it is often necessary to ensure the storage and transmission of large data sets, including image collections. Data compression technologies are used to reduce the amount of memory required and increase the speed of information transmission. To date, approaches based on the use of discrete wavelet transformations have been developed and applied. The advantage of these transformations is the ability to localize the points of brightness change in images. The detailing coefficients corresponding to such points make a significant contribution to the energy of the image. This contribution can be quantified in the form of weights, the analysis of which allows us to determine the method of quantization of the coefficients of the wavelet transform in the proposed lossy compression method. The approach described in the paper corresponds to the general scheme of image compression and provides for the stages of transformation, quantization and encoding. It provides good compression performance and can be used in information processing and control systems.

    Keywords: image processing, image compression, redundancy in images, general image compression scheme, wavelet transform, compression based on wavelet transform, weight model, significance of detail coefficients, quantization, entropy coding
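
    The quantization-by-weight idea can be sketched in one dimension with the Haar transform: details whose magnitude (weight) falls below a threshold are zeroed before encoding. The threshold rule here is an assumed simplification of the article's weight model; a threshold of 0 keeps the transform lossless, which the reconstruction check relies on.

```python
# One Haar step splits data into averages and details; "light" details
# are zeroed before encoding (the lossy part of the scheme).

def haar_forward(x):
    a = [(x[2*i] + x[2*i+1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2*i] - x[2*i+1]) / 2 for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    out = []
    for ai, di in zip(a, d):
        out += [ai + di, ai - di]
    return out

def compress(x, threshold):
    a, d = haar_forward(x)
    d = [0.0 if abs(di) < threshold else di for di in d]  # drop light details
    return a, d

row = [52.0, 50.0, 200.0, 202.0, 51.0, 49.0, 180.0, 120.0]
a, d = compress(row, threshold=0.0)
restored = haar_inverse(a, d)
```

    With a positive threshold, the small details at smooth regions vanish while the large detail at the 180/120 brightness edge survives, which is exactly the edge-localization property the abstract attributes to the wavelet transform.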

  • Methods for using large language models to identify provocative comments on social media during protests

    This paper explores methods for using large language models (LLM) to detect provocative comments on social media during mass protests. We analyze existing approaches to text data processing and develop a methodology for using LLama3 8B, Mistral 7B, and Gemma 7B models. The effectiveness of the models is evaluated using comments related to the protests in Mongolia in December 2022. The results demonstrate that the Mistral 7B model has the highest accuracy and efficiency in classifying provocative comments. The findings confirm that the use of large language models significantly improves the accuracy and speed of content analysis on social media, which is important for public opinion management and conflict prevention.

    Keywords: large language model, social network, provocative comment, mass protest, text analysis, natural language processing, LLama3 8B, Mistral 7B, Gemma 7B, classification, accuracy, recall, F1-score, political instability, public opinion management
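
    The accuracy, recall, and F1-score named in the keywords can be computed as below; the labels and predictions are hypothetical, since the Mongolian-protest dataset is not reproduced here.

```python
# Precision, recall and F1 for a binary provocative-comment classifier.

def prf1(y_true, y_pred, positive=1):
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# 1 = provocative, 0 = neutral (labels are illustrative).
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
precision, recall, f1 = prf1(y_true, y_pred)
```

    Comparing these scores across the LLama3 8B, Mistral 7B, and Gemma 7B outputs on a held-out comment set is the model-selection procedure the abstract describes.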

  • Solving the problem of collecting and preparing data and knowledge coming from several information systems for their intelligent processing (by the example of registrar's reporting)

    The paper considers the task of collecting and preparing data coming from several information systems, using the automation of registrar's reporting as an example. The OWL, XML and XBRL languages, as well as semantic networks, can be used to describe the subject area. A set of criteria is prepared for analysing and selecting the most appropriate knowledge representation language for data collection, using financial statements as an example. The results of service development are described and the application of the XBRL format is shown. A multi-agent approach to modelling and designing information systems was used in the development of the service.

    Keywords: data mining, subject area model, data formats, XBRL, business process, service, data integration

  • Spatio-temporal structure of metageosystems in regional geoportals

    The article is devoted to solving the problem of forming mechanisms that ensure the sustainable functioning of natural and man-made risk management processes in territorially distributed organizational systems, determined both by the spatio-temporal structure of the enclosing natural landscapes and by the characteristics of man-made impact. The article presents and substantiates the position that regional geographic information systems and geoportals should play a key role in the risk management system. The examples of the regional GIS "Mordovia" and the geoportals "Metageosystems of Mordovia. Spatial data of the region" and "Natural and cultural heritage of the Republic of Mordovia. Traveling with the Russian Geographical Society" show the possibility of using an analytical approach to the analysis of geosystems at various hierarchical levels to solve the problem of risk analysis in territorially distributed organizational systems.

    Keywords: geographic information system, geoportal, risk analysis, spatial data, geosystems, geographically distributed organizational system

  • Computer vision algorithms for object recognition in low visibility conditions

    The work is devoted to the development and analysis of computer vision algorithms designed to recognize objects in conditions of limited visibility, such as fog, rain or poor lighting. In the context of modern requirements for safety and automation, the task of identifying objects becomes especially relevant. The theoretical foundations of computer vision methods and their application in difficult conditions are considered. An analysis of image processing algorithms is carried out, including machine learning and deep learning methods that are adapted to work in conditions of poor visibility. The results of experiments demonstrating the effectiveness of the proposed approaches are presented, as well as a comparison with existing recognition systems. The results of the study can be useful in the development of autonomous vehicles and video surveillance systems.

    Keywords: computer vision, mathematical modeling, software package, machine learning methods, autonomous transport systems

  • Visualization and comparison of semantic trees reflecting the component structure of the patented device

    This paper describes approaches to the visualization and comparison of semantic trees reflecting the component structure of a patented device and the connections between components, using graph databases. Such DBMSs use graph structures to store, process and represent data. The main elements of a graph database are nodes and edges, which, within the framework of the task, model entities of three types (SYSTEM, COMPONENT, ATTRIBUTE) and five types of connections (PART-OF, LOCATED-AT, CONNECTED-WITH, ATTRIBUTE-FOR, IN-MANNER-OF). According to the results of the study, Neo4j demonstrates the best graph visualization capabilities; ArangoDB, despite correctly entered queries, performs incomplete visualization; AllegroGraph proved difficult to work with in code and to configure for graph tree visualization. Three algorithms for comparing graph representations of information were tested: Graph Edit Distance, Topological Comparison, and Subgraph Isomorphism. The algorithms are implemented in Python; the implementation compares two graph trees and displays a visualization and an analysis of common graph structures and differences.

    Keywords: semantic tree, component structure, patent, graph databases, Neo4j, AllegroGraph, ArangoDB
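
    The simplest of the three tested comparisons can be sketched as a labeled-edge diff between two component trees; this is a lightweight stand-in for Graph Edit Distance, and the patent edges below are invented for illustration.

```python
# Set-based diff of two component trees represented as labeled edges.

def edge_diff(tree_a, tree_b):
    """Trees as sets of (head, relation, tail) labeled edges."""
    common = tree_a & tree_b
    only_a = tree_a - tree_b
    only_b = tree_b - tree_a
    # A crude edit-distance-like score: edges to delete plus edges to insert.
    cost = len(only_a) + len(only_b)
    return common, only_a, only_b, cost

patent_v1 = {("device", "PART-OF", "housing"),
             ("sensor", "LOCATED-AT", "housing"),
             ("sensor", "CONNECTED-WITH", "controller")}
patent_v2 = {("device", "PART-OF", "housing"),
             ("sensor", "LOCATED-AT", "cover"),
             ("sensor", "CONNECTED-WITH", "controller")}
common, removed, added, cost = edge_diff(patent_v1, patent_v2)
```

    Here the diff isolates a single relocated component (the sensor moved from the housing to the cover), which is the kind of structural difference the graph-database visualizations above are meant to surface.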