

  • Development of a comprehensive system for evaluating the performance of a construction design organization with the introduction of KPIs

    The paper examines current problems of the construction industry, proposes an algorithm for introducing modern flexible management methodologies to improve the efficiency of management processes in construction design organizations, and develops a variant of an integrated performance assessment system based on KPIs.

    Keywords: construction, design organizations, KPIs, flexible management, Agile, Lean manufacturing, stakeholders, efficiency
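
    A minimal sketch (not from the paper) of how an integrated KPI-based performance score might be aggregated; the KPI names, targets, and weights below are hypothetical placeholders.

    ```python
    # Minimal sketch (assumptions): aggregating normalized KPI values into a single
    # performance score for a design organization. KPI names, targets and weights
    # are hypothetical placeholders, not the paper's indicator set.

    KPIS = {
        # name: (actual value, target value, weight)
        "projects_delivered_on_time_pct": (82.0, 95.0, 0.4),
        "documentation_rework_pct":       (12.0, 5.0,  0.3),   # lower is better
        "staff_utilization_pct":          (74.0, 85.0, 0.3),
    }

    def kpi_score(actual: float, target: float, lower_is_better: bool = False) -> float:
        """Normalize a KPI to [0, 1] relative to its target."""
        ratio = target / actual if lower_is_better else actual / target
        return min(ratio, 1.0)

    def integrated_score(kpis: dict) -> float:
        """Weighted sum of normalized KPI scores."""
        total = 0.0
        for name, (actual, target, weight) in kpis.items():
            total += weight * kpi_score(actual, target, lower_is_better="rework" in name)
        return total

    if __name__ == "__main__":
        print(f"Integrated performance score: {integrated_score(KPIS):.2f}")
    ```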

  • A fuzzy comparison method for managing the functioning of an organizational system

    Modern digitalization processes involve the use of intelligent systems at key stages of information processing. Since the data available for intelligent analysis in organizational systems are often fuzzy, the problem arises of comparing the corresponding units of information with each other. Several methods for such a comparison are known. In particular, for random fuzzy variables with known distribution laws, the degree of coincidence of these distribution laws can serve as a criterion of how well one random variable corresponds to another. However, this approach lacks the flexibility required to solve practical problems. The proposed approach makes it possible to compare fuzzy data with fuzzy data, fuzzy data with crisp data, and crisp data with crisp data. The paper provides an example illustrating this approach. The material presented in the study was initially focused on managing organizational systems in education, but the results can be extended to other organizational systems.

    Keywords: fuzzy data, weakly structured problems, comparison criteria, analytic hierarchy process, systems analysis, fuzzy benchmarking
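
    A minimal sketch (assumptions, not the authors' method) of one common way to compare fuzzy and crisp values uniformly: both are represented by triangular membership functions, a crisp number being the degenerate triangle (x, x, x), and the comparison criterion is the degree of overlap (possibility measure).

    ```python
    # Sketch under assumptions: fuzzy-fuzzy, fuzzy-crisp and crisp-crisp comparison
    # via triangular membership functions and a possibility (overlap) measure.

    def tri_membership(x: float, a: float, b: float, c: float) -> float:
        """Triangular membership with support [a, c] and peak at b (a <= b <= c)."""
        if x < a or x > c:
            return 0.0
        if x == b:
            return 1.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def similarity(A, B, steps: int = 2000) -> float:
        """Highest degree to which fuzzy numbers A and B (as (a, b, c)) overlap."""
        lo, hi = min(A[0], B[0]), max(A[2], B[2])
        xs = list(A) + list(B)                      # always include the breakpoints
        if hi > lo:
            xs += [lo + (hi - lo) * i / steps for i in range(steps + 1)]
        return max(min(tri_membership(x, *A), tri_membership(x, *B)) for x in xs)

    if __name__ == "__main__":
        fuzzy_estimate = (3.3, 3.8, 4.3)   # fuzzy "about 3.8" (illustrative)
        crisp_target   = (4.0, 4.0, 4.0)   # crisp value 4.0
        print(f"degree of correspondence: {similarity(fuzzy_estimate, crisp_target):.2f}")
    ```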

  • Development of a wave energy conversion system for powering marine buoys using piezoelectric elements

    This study investigates the integration of piezoelectric elements with marine buoys for harvesting wave energy in autonomous marine devices. The buoy system was tested under controlled wave conditions, yielding a peak voltage of 5.6 V and a maximum power of 40 µW. The findings indicate the viability of the system for powering low-power marine equipment. Integrating piezoelectric elements into marine buoy systems offers a cost-effective hybrid solution, making it a promising power source for buoys and sensors in remote offshore environments.

    Keywords: wave energy conversion, sea waves, piezoelectric elements, wave height, wavelength
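
    A back-of-envelope sketch (assumptions, not from the paper) relating the reported figures of 5.6 V peak voltage and 40 µW maximum power, under the simplifying assumption of a sinusoidal output across a resistive load.

    ```python
    # Back-of-envelope check under assumptions: sinusoidal output, reported peak
    # voltage taken across the load at the maximum-power operating point.

    V_PEAK = 5.6      # V, reported peak voltage
    P_MAX = 40e-6     # W, reported maximum power

    v_rms = V_PEAK / 2 ** 0.5              # RMS of a sinusoid
    r_load = v_rms ** 2 / P_MAX            # implied load resistance
    energy_per_hour = P_MAX * 3600         # J harvested per hour at peak power

    print(f"RMS voltage:              {v_rms:.2f} V")
    print(f"Implied load resistance:  {r_load / 1e3:.0f} kOhm")
    print(f"Energy per hour at peak:  {energy_per_hour * 1e3:.0f} mJ")
    ```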

  • Creating a dataset of Russian texts for emotion analysis using Robert Plutchik's model

    The purpose of the research is to increase the granularity of sentiment classification for Russian-language texts by developing a dataset with an extensive set of emotional categories. The paper discusses the main methods of sentiment analysis and the main emotion models. A software system for decentralized data labeling has been developed and described. The novelty of this work is that, for the first time, an emotion model with more than eight emotion classes, namely R. Plutchik's model, is used to determine the emotional coloring of Russian-language texts. As a result, a new dataset was developed for studying and analyzing emotions. It consists of 24,435 unique records labeled into 32 emotion classes, making it one of the most diverse and detailed datasets in the field. Using the resulting dataset, a neural network was trained to determine the set of emotions the author expressed when writing a text. The dataset opens opportunities for further research in this area; one promising task is to improve the efficiency of models trained on it.

    Keywords: sentiment, analysis, model, Robert Plutchik, emotions, markup, text
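
    A minimal sketch (assumptions, not the authors' pipeline) of multi-label emotion classification over such a dataset; the file name and column names ("text", "emotions") are hypothetical, and a linear baseline stands in for the neural network described in the paper.

    ```python
    # Sketch under assumptions: multi-label classification of Russian texts into
    # Plutchik-style emotion classes. Dataset path and columns are hypothetical.
    import pandas as pd
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.preprocessing import MultiLabelBinarizer

    df = pd.read_csv("emotion_dataset.csv")            # hypothetical file
    texts = df["text"]
    labels = df["emotions"].str.split(";")             # e.g. "joy;trust"

    mlb = MultiLabelBinarizer()
    y = mlb.fit_transform(labels)                      # one column per emotion class

    X_train, X_test, y_train, y_test = train_test_split(texts, y, test_size=0.2, random_state=42)

    vectorizer = TfidfVectorizer(max_features=50_000, ngram_range=(1, 2))
    X_train_vec = vectorizer.fit_transform(X_train)
    X_test_vec = vectorizer.transform(X_test)

    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
    clf.fit(X_train_vec, y_train)
    # exact-match (subset) accuracy over the full emotion set of each text
    print(f"subset accuracy: {clf.score(X_test_vec, y_test):.3f}")
    ```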

  • A mathematical model of a fault-tolerant nonlinear conversion unit for OFDM wireless systems with frequency hopping

    With the development of low-orbit satellite Internet systems, ensuring effective operation under intentional interference comes to the fore. One solution relies on systems that combine OFDM methods with generators implementing frequency hopping (FH). Clearly, the more complex the algorithm for selecting operating frequencies, the more effective the FH system. The article proposes using the SPN cipher "Grasshopper" as the generator that selects operating frequencies. As a result, the FH system gains high resistance to attempts by electronic warfare systems to compute the operating frequency numbers. However, faults and failures may occur during operation of the cipher unit. To prevent their consequences, it is proposed to implement the SPN cipher using polynomial modular codes of residue classes (PMCC). One of the transformations in "Grasshopper" is the nonlinear transformation that performs the substitution operation. Creating a new mathematical model for performing this nonlinear transformation using PMCC will ensure operation of the SPN-cipher-based FH generator under faults and failures.

    Keywords: low-orbit satellite Internet systems, the Grasshopper SPN cipher, nonlinear transformations, modular codes of residue classes, mathematical model, fault tolerance, frequency hopping, polynomial modular code of residue classes
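
    An illustrative sketch (assumptions) of the general idea of driving a frequency-hopping schedule from a keyed pseudorandom generator: here SHA-256 in counter mode stands in for the "Grasshopper" SPN cipher, and the key and channel count are hypothetical.

    ```python
    # Sketch under assumptions: mapping a keyed pseudorandom stream to operating
    # frequency numbers. SHA-256 in counter mode is only a stand-in for the
    # "Grasshopper" cipher discussed in the article.
    import hashlib

    KEY = bytes.fromhex("000102030405060708090a0b0c0d0e0f")  # hypothetical key
    N_CHANNELS = 64                                          # hypothetical channel grid

    def hop_sequence(key: bytes, length: int, n_channels: int = N_CHANNELS):
        """Derive `length` channel numbers from a keyed counter-mode stream."""
        hops, counter = [], 0
        while len(hops) < length:
            block = hashlib.sha256(key + counter.to_bytes(16, "big")).digest()
            # each output byte selects one channel; unpredictable without the key
            hops.extend(b % n_channels for b in block)   # 256 % 64 == 0, no bias
            counter += 1
        return hops[:length]

    if __name__ == "__main__":
        print(hop_sequence(KEY, 10))
    ```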

  • Development of a technique for automated control of the gloss of chocolate bars based on machine vision for automation of cooling and molding processes

    The article presents a technique for automated gloss control of chocolate bars based on machine vision, integrated into the functional automation scheme of the cooling and molding processes. The key factors affecting gloss are considered, existing control methods are analyzed, and the need for continuous objective quality assessment is substantiated. To optimize the process, a digital simulation was created in the R-PRO environment, which allows various technological modes to be simulated. The developed image processing algorithms calculate quantitative gloss values and provide feedback to the control system, adjusting key production parameters. The proposed approach improves control accuracy, reduces defects, and shortens equipment commissioning time, creating conditions for further progress toward full automation of the chocolate factory.

    Keywords: chocolate, surface gloss, automation, machine vision, quality control, cooling and molding, digital simulation
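
    A minimal sketch (not the authors' algorithm) of one simple way to quantify gloss from a grayscale frame and turn it into a feedback decision: the gloss index is the share of specular-highlight pixels, and the threshold, target band, and control messages are illustrative assumptions.

    ```python
    # Sketch under assumptions: a crude gloss index from image brightness plus a
    # toy feedback rule; thresholds and the synthetic test frame are illustrative.
    import numpy as np

    HIGHLIGHT_LEVEL = 220        # pixel intensity treated as a specular highlight
    GLOSS_TARGET = (0.05, 0.12)  # acceptable share of highlight pixels

    def gloss_index(gray: np.ndarray) -> float:
        """Share of pixels brighter than the highlight threshold (0..1)."""
        return float(np.mean(gray >= HIGHLIGHT_LEVEL))

    def control_action(gloss: float) -> str:
        """Illustrative feedback rule for the cooling/molding setpoints."""
        lo, hi = GLOSS_TARGET
        if gloss < lo:
            return "gloss below target band: adjust cooling/molding setpoints"
        if gloss > hi:
            return "gloss above target band: adjust cooling/molding setpoints"
        return "within target band: keep current setpoints"

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frame = rng.normal(150, 30, size=(480, 640)).clip(0, 255)  # synthetic surface
        g = gloss_index(frame)
        print(f"gloss index = {g:.3f} -> {control_action(g)}")
    ```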

  • Prediction of the solubility of a substance in supercritical fluids based on machine learning

    The study examines the methodologies used in practice to determine the parameters of processes occurring in supercritical fluid media. A primary focus is the solubility of key components of the system in supercritical fluid solvents and the limitations of mathematical models in qualitatively predicting solubility outside the investigated ranges of values; the analysis highlights the challenges and opportunities of conducting experimental studies in this domain. Within supercritical fluid technologies, however, process optimization and property prediction are attainable through models and machine learning methods that leverage accumulated experimental and calculated data. The study examines this approach, considering the system's input parameters, solvent properties, solute properties, and the designated output parameter, solubility. The findings demonstrate the efficacy of this approach in predicting solubility through machine learning.

    Keywords: supercritical fluids, solubility of substances, solubility factors, solubility prediction, machine learning, residue analysis, feature importance analysis
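
    A minimal sketch (assumptions, not the study's model) of the general workflow: a tree-ensemble regressor maps temperature, pressure, and a simple solute descriptor to solubility, followed by a feature-importance readout. The data below are synthetic, generated only so the example runs.

    ```python
    # Sketch under assumptions: solubility regression with feature importances.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)
    n = 500
    T = rng.uniform(308, 348, n)            # K
    P = rng.uniform(10, 40, n)              # MPa
    mol_mass = rng.uniform(150, 450, n)     # g/mol, illustrative solute descriptor
    # synthetic "solubility" with noise, for demonstration only
    y = 1e-4 * P * np.exp(-(T - 308) / 80) / np.sqrt(mol_mass) + rng.normal(0, 1e-5, n)

    X = np.column_stack([T, P, mol_mass])
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    model.fit(X_train, y_train)
    print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
    for name, imp in zip(["T", "P", "molar mass"], model.feature_importances_):
        print(f"{name:>10}: importance {imp:.2f}")
    ```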

  • Optimization of the composition of fast-hardening heavy cement concretes with a complex additive from industrial waste

    The composition of heavy cement concretes modified with a complex additive based on industrial waste (an alumina-containing component, aluminum slag (ASH), and spent molding mixture (OFS)) is optimized using the PlanExp B-D13 software package in a three-factor planned experiment, with compressive strength on the 2nd and 28th days of hardening as the optimization criteria.

    Keywords: heavy cement concretes, fast-hardening concretes, optimization, experiment planning, strength indicators, industrial waste, spent molding mixture, aluminum slag
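
    A minimal sketch (assumptions) of how a three-factor planned experiment of this kind is typically processed: a second-order regression model is fitted to coded factor levels by least squares. The factor assignments and response values below are illustrative placeholders, not the paper's data.

    ```python
    # Sketch under assumptions: fitting y = b0 + sum(bi*xi) + sum(bij*xi*xj) + sum(bii*xi^2)
    # to a three-factor Box-Behnken-style design; data are illustrative only.
    import numpy as np
    from itertools import combinations

    # coded levels (-1, 0, +1); hypothetical assignment: x1 = cement content,
    # x2 = aluminum slag dosage, x3 = spent molding mixture dosage
    X = np.array([
        [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
        [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
        [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
        [0, 0, 0], [0, 0, 0], [0, 0, 0],
    ], dtype=float)
    y = np.array([38.1, 41.5, 40.2, 44.8, 37.5, 42.0, 39.3, 43.1,
                  38.8, 41.9, 39.9, 43.6, 42.5, 42.9, 42.7])  # MPa, illustrative

    def design_matrix(X):
        cols = [np.ones(len(X))] + [X[:, i] for i in range(3)]
        cols += [X[:, i] * X[:, j] for i, j in combinations(range(3), 2)]
        cols += [X[:, i] ** 2 for i in range(3)]
        return np.column_stack(cols)

    coef, *_ = np.linalg.lstsq(design_matrix(X), y, rcond=None)
    names = ["b0", "b1", "b2", "b3", "b12", "b13", "b23", "b11", "b22", "b33"]
    print(dict(zip(names, np.round(coef, 3))))
    ```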

  • Calibration of a magnetometer in an integrated navigation system of a small-class unmanned underwater vehicle

    The article is devoted to the development and implementation of a two-stage magnetometer calibration algorithm integrated into the navigation system of a small-class unmanned underwater vehicle. At the first stage, an ellipsoidal approximation method is used to compensate for soft-iron and hard-iron distortion, ensuring the correct geometric locus of magnetometer measurements. The second stage estimates the rotation between the coordinate systems of the magnetometer and accelerometer using quaternions as rotation parameters. Experimental verification of the algorithm demonstrated its effectiveness: after the two-stage calibration, the calibration parameters were determined, and their use confirmed good consistency between magnetometer readings and actual magnetic field data, indicating the feasibility of using this technique for calibrating magnetometers. The proposed two-stage calibration algorithm does not require laboratory equipment and can be carried out under real-world operating conditions, which makes it possible to integrate it into the onboard software of unmanned underwater vehicles.

    Keywords: calibration, magnetometer, accelerometer, MEMS sensor, AHRS, navigation system, unmanned underwater vehicle, ellipsoid approximation, quaternion, magnetic inclination
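
    A minimal sketch (simplified relative to the article) of the first calibration stage: least-squares ellipsoid fitting to raw magnetometer samples. For brevity the model is an axis-aligned ellipsoid (diagonal soft-iron matrix); the paper's method also handles cross-axis terms and adds a second, quaternion-based alignment stage.

    ```python
    # Sketch under assumptions: axis-aligned ellipsoid fit for hard-iron offset
    # and per-axis scale; synthetic test data.
    import numpy as np

    def fit_axis_aligned_ellipsoid(m: np.ndarray):
        """m: Nx3 raw magnetometer samples. Returns (offset, scale) per axis."""
        x, y, z = m[:, 0], m[:, 1], m[:, 2]
        # solve a*x^2 + b*y^2 + c*z^2 + d*x + e*y + f*z = 1 in the least-squares sense
        A = np.column_stack([x**2, y**2, z**2, x, y, z])
        a, b, c, d, e, f = np.linalg.lstsq(A, np.ones(len(m)), rcond=None)[0]
        offset = np.array([-d / (2 * a), -e / (2 * b), -f / (2 * c)])
        gain = 1 + a * offset[0]**2 + b * offset[1]**2 + c * offset[2]**2
        radii = np.sqrt(gain / np.array([a, b, c]))
        scale = radii.mean() / radii          # normalize axes to a common radius
        return offset, scale

    def apply_calibration(m, offset, scale):
        return (m - offset) * scale

    if __name__ == "__main__":
        # synthetic data: 50 uT field, hard-iron offset and per-axis distortion
        rng = np.random.default_rng(2)
        u = rng.normal(size=(2000, 3))
        u /= np.linalg.norm(u, axis=1, keepdims=True)
        raw = u * 50.0 * np.array([1.2, 0.9, 1.05]) + np.array([12.0, -7.0, 3.0])
        offset, scale = fit_axis_aligned_ellipsoid(raw + rng.normal(0, 0.3, raw.shape))
        print("estimated hard-iron offset:", np.round(offset, 2))
        print("per-axis scale factors:   ", np.round(scale, 3))
    ```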

  • One of the Approaches to Analyzing Source Code in Student Projects

    When evaluating student work, the analysis of written assignments, particularly the analysis of source code, becomes especially relevant. This article discusses an approach for evaluating the dynamics of feature changes in students' source code. Various source code metrics are analyzed and key metrics are identified, including quantitative metrics, program control flow complexity metrics, and the TIOBE quality indicator. A set of text data containing program source code from a website dedicated to practical programming was used to determine threshold values for each metric and to categorize them. The results were used to analyze students' source code with a developed service that evaluates work based on key features, tracks the dynamics of code indicators, and shows a student's position within the group based on the obtained values.

    Keywords: machine learning, text data analysis, program code analysis, digital footprint, data visualization
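
    A minimal sketch (assumptions) of the kind of metrics involved: quantitative metrics and a crude McCabe-style control-flow complexity estimate computed for a Python snippet. The TIOBE quality indicator and the article's thresholds are not reproduced here.

    ```python
    # Sketch under assumptions: simple source-code metrics for a Python snippet.
    import ast

    SAMPLE = '''
    def grade(score):
        if score >= 90:
            return "A"
        for bound, mark in [(75, "B"), (60, "C")]:
            if score >= bound:
                return mark
        return "F"
    '''

    def metrics(source: str) -> dict:
        tree = ast.parse(source)
        branch_nodes = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)
        return {
            "lines_of_code": len([l for l in source.splitlines() if l.strip()]),
            "functions": sum(isinstance(n, ast.FunctionDef) for n in ast.walk(tree)),
            # McCabe-style estimate: 1 + number of decision points
            "cyclomatic_complexity": 1 + sum(isinstance(n, branch_nodes) for n in ast.walk(tree)),
        }

    if __name__ == "__main__":
        print(metrics(SAMPLE))
    ```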

  • Sufficient condition for stability of a dynamic system of stabilization of equilibrium position and programmed motions

    In the article, based on an estimate of the Euclidean norm of the deviation between the transient and stationary states of the dynamic system, the contraction condition for the generalized projection operator of a dynamic system with constraints is derived. From the principle of contraction mappings, taking into account the derived contraction condition of the projection operator, estimates are obtained for a sufficient condition for the stability of the dynamic system stabilizing the equilibrium position and programmed motions. The obtained estimates generalize previously obtained results. The stability of the operator of the constrained dynamic system is demonstrated experimentally.

    Keywords: sufficient condition for stability, projection operator, stabilization of equilibrium position, stabilization of program motions, SimInTech
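
    An illustrative sketch (not the paper's system) of the contraction condition underlying the sufficient stability criterion: it numerically checks ||F(x) - F(y)|| <= q ||x - y|| with q < 1 for a toy constrained map composed of a linear step and a projection onto a box.

    ```python
    # Sketch under assumptions: empirical check of the contraction condition for a
    # toy projection-constrained dynamic map (not the article's system).
    import numpy as np

    A = np.array([[0.6, 0.2],
                  [-0.1, 0.5]])            # toy stable dynamics (spectral norm < 1)
    LOW, HIGH = -1.0, 1.0                  # box constraint set

    def F(x):
        """One step of the constrained dynamics: projection of A @ x onto the box."""
        return np.clip(A @ x, LOW, HIGH)

    rng = np.random.default_rng(3)
    q = 0.0
    for _ in range(10_000):
        x, y = rng.uniform(-3, 3, 2), rng.uniform(-3, 3, 2)
        if not np.allclose(x, y):
            q = max(q, np.linalg.norm(F(x) - F(y)) / np.linalg.norm(x - y))
    # q < 1 implies, by the contraction mapping principle, a unique stable fixed point
    print(f"empirical contraction constant q = {q:.3f}")
    ```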

  • Evaluation of the complexity of dominator tree construction algorithms for data flow analysis in the frontend of a compiler for the Solidity programming language

    This article discusses two of the most popular algorithms for constructing dominator trees in the context of static code analysis for the Solidity programming language: the Cooper-Harvey-Kennedy iterative algorithm and the Lengauer-Tarjan algorithm, both considered effective and widely used in practice. The article compares these algorithms by execution time and memory usage, evaluates their complexity, and selects the most suitable option in the context of Solidity. The Cooper-Harvey-Kennedy iterative algorithm showed higher performance on small projects, while the Lengauer-Tarjan algorithm performed better when analyzing larger ones. Overall, however, the Cooper-Harvey-Kennedy algorithm proved preferable in the context of Solidity, showing higher efficiency and accuracy when analyzing smart contracts in this language. The article may be useful for developers and researchers involved in static code analysis for Solidity, who can apply its results and conclusions in their work.

    Keywords: dominator tree, Solidity, algorithm comparison
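
    A sketch of the Cooper-Harvey-Kennedy iterative dominator algorithm discussed in the article, written in Python for brevity (the article's context is a Solidity frontend compiler); the example control-flow graph is hypothetical.

    ```python
    # Cooper-Harvey-Kennedy iterative dominators over a CFG given as
    # node -> list of successors; returns the immediate-dominator map.

    def dominators(cfg: dict, entry):
        order, seen = [], set()
        def dfs(n):                                  # postorder traversal
            seen.add(n)
            for s in cfg.get(n, []):
                if s not in seen:
                    dfs(s)
            order.append(n)
        dfs(entry)
        rpo = order[::-1]                            # reverse postorder
        po_num = {n: i for i, n in enumerate(order)}
        preds = {n: [] for n in rpo}
        for n in rpo:
            for s in cfg.get(n, []):
                if s in preds:
                    preds[s].append(n)

        idom = {entry: entry}
        def intersect(a, b):                         # walk up the dominator tree
            while a != b:
                while po_num[a] < po_num[b]:
                    a = idom[a]
                while po_num[b] < po_num[a]:
                    b = idom[b]
            return a

        changed = True
        while changed:
            changed = False
            for n in rpo:
                if n == entry:
                    continue
                new = None
                for p in preds[n]:
                    if p in idom:
                        new = p if new is None else intersect(p, new)
                if new is not None and idom.get(n) != new:
                    idom[n] = new
                    changed = True
        return idom

    if __name__ == "__main__":
        cfg = {"entry": ["a", "b"], "a": ["c"], "b": ["c"], "c": ["d"], "d": []}
        # expected: a, b, c are immediately dominated by "entry"; d by "c"
        print(dominators(cfg, "entry"))
    ```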

  • Engineering preparation of a construction site in cramped conditions of dense development

    A characteristic feature of urban construction is the increasing density of development and, accordingly, the constrained working conditions. The need to ensure the safety of existing buildings and to reduce negative impacts on the urban environment requires rational solutions at the organizational and technological design stage when building in dense surroundings. Constraint factors already come into play at the construction site preparation stage. The purpose of the study is to select and justify methods of work for relocating utility networks out of the construction zone. Existing methods of network re-routing are considered and the technologies compared. Using a construction project as an example, an organizational and technological solution for relocating networks using trenchless technologies is presented.

    Keywords: engineering preparation, construction site, dense development, cramped conditions, trenchless laying of networks

  • Development of an algorithm and software implementation of a module for correlation and regression analysis and visualization of user activity

    The purpose of the article is the software implementation of a module for analyzing site user activity based on a click heat map, compatible with domestic web services and combining correlation and regression analysis with dashboard visualization before and after changes to site elements. All functionality runs directly in the web analytics service. Based on the data obtained for the analyzed site element, a decision is made to adjust its design and/or content to increase the click-through rate. The proposed solution thus extends the functionality of the web analytics service and reduces labor costs. The software module has been successfully tested: after the analysis and the necessary adjustments to the site, the click-through rate increased.

    Keywords: user activity, correlation and regression analysis, dashboard, program module, trend line, coefficient of determination
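
    A minimal sketch (assumptions) of the kind of analysis the module performs: a linear trend line, the Pearson correlation, and the coefficient of determination computed over click counts for an element. The daily counts below are hypothetical; in the described module they would come from the web analytics service.

    ```python
    # Sketch under assumptions: trend line, correlation, and R^2 for click data.
    import numpy as np

    days = np.arange(1, 15)                            # observation days
    clicks = np.array([112, 118, 121, 119, 130, 135, 131, 142,
                       140, 151, 149, 158, 163, 161])  # clicks on the element (illustrative)

    # linear trend line: clicks ~ slope * day + intercept
    slope, intercept = np.polyfit(days, clicks, 1)
    trend = slope * days + intercept

    # Pearson correlation and coefficient of determination
    r = np.corrcoef(days, clicks)[0, 1]
    ss_res = np.sum((clicks - trend) ** 2)
    ss_tot = np.sum((clicks - clicks.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot

    print(f"trend: clicks = {slope:.2f} * day + {intercept:.1f}")
    print(f"Pearson r = {r:.3f}, R^2 = {r2:.3f}")
    ```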

  • Modeling the Random Forest Machine Learning Algorithm Using the Mathematical Apparatus of Petri Net Theory

    The article considers the possibility of modeling the random forest machine learning algorithm using the mathematical apparatus of Petri net theory. The proposed approach is based on three types of Petri nets: classical nets, colored nets, and nested nets. To this end, the paper considers the general structure of decision trees and the rules for constructing models based on a bipartite directed graph, with a subsequent transition to the random forest algorithm. The article provides examples of modeling this algorithm using Petri nets, with the formation of a reachability tree of markings that corresponds to the operation of both individual decision trees and the random forest.

    Keywords: Petri net, decision tree, random forest, machine learning, Petri net theory, bipartite directed graph, intelligent systems, evolutionary algorithms, decision support systems, mathematical modeling, graph theory, simulation modeling
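
    An illustrative sketch (assumptions, much simplified relative to the article): a single decision tree encoded as a tiny Petri-net-like structure, where places are tree nodes, transitions carry guards over the input features, and a token travels from the root place to a leaf place, reproducing the tree's decision path. A random forest would then be a set of such nets with majority voting.

    ```python
    # Sketch under assumptions: token flow through a guarded net that mimics a
    # decision tree; structure and data are illustrative only.

    # transitions: (input place, guard over features, output place)
    TREE_NET = [
        ("root",    lambda f: f["x1"] <  5.0, "n_left"),
        ("root",    lambda f: f["x1"] >= 5.0, "n_right"),
        ("n_left",  lambda f: f["x2"] <  2.0, "leaf_A"),
        ("n_left",  lambda f: f["x2"] >= 2.0, "leaf_B"),
        ("n_right", lambda f: True,           "leaf_B"),
    ]
    LEAVES = {"leaf_A": "class A", "leaf_B": "class B"}

    def run_net(transitions, features, initial_place="root"):
        marking = {initial_place: 1}                       # one token in the root place
        while not any(p in LEAVES for p in marking):
            for src, guard, dst in transitions:
                if marking.get(src) and guard(features):   # transition is enabled
                    marking = {dst: 1}                     # fire: move the token
                    break
        (leaf,) = marking
        return LEAVES[leaf]

    if __name__ == "__main__":
        print(run_net(TREE_NET, {"x1": 3.2, "x2": 4.1}))   # -> class B
    ```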