

  • Development and Analysis of a Feature Model for Dynamic Handwritten Signature Recognition

    In this work, we present the development and analysis of a feature model for dynamic handwritten signature recognition aimed at improving recognition effectiveness. The feature model is based on the extraction of both global features (signature length, average angle between signature vectors, range of dynamic characteristics, proportionality coefficient, average input speed) and local features (pen coordinates, pressure, azimuth, and tilt angle). We use the method of potentials to generate a signature template that accounts for variations in writing style. Experimental evaluation was conducted on the MCYT_Signature_100 signature database, which contains 2500 genuine and 2500 forged samples. We determined optimal compactness values for each feature, which allows signature writing variability to be accommodated and recognition accuracy to be enhanced. The results confirm the effectiveness of the proposed feature model and its potential for biometric authentication systems, and are of practical interest to information security specialists (an illustrative sketch of the global-feature computation follows the keywords).

    Keywords: dynamic handwritten signature, signature recognition, biometric authentication, feature model, potential method, MCYT_Signature_100, FRR, FAR
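
    The abstract names the global features but does not give their formulas; below is a minimal illustrative sketch in Python, assuming a signature is captured as rows of (x, y, pressure, azimuth, tilt) at a fixed sampling interval. The feature definitions and the kernel used for the potential score are plausible readings of the abstract, not the paper's exact formulas.

    ```python
    import numpy as np

    def global_features(sig, dt=0.01):
        """Some of the global features named in the abstract.

        sig: array of shape (N, 5) with columns x, y, pressure, azimuth, tilt,
        sampled every dt seconds (an assumed layout, not the paper's format).
        """
        xy = sig[:, :2]
        seg = np.diff(xy, axis=0)                        # pen-movement vectors
        length = np.linalg.norm(seg, axis=1).sum()       # signature length
        ang = np.arctan2(seg[:, 1], seg[:, 0])
        avg_angle = np.mean(np.abs(np.diff(ang)))        # average angle between movement vectors
        avg_speed = length / (dt * (len(sig) - 1))       # average input speed
        dyn_range = sig[:, 2:].max(axis=0) - sig[:, 2:].min(axis=0)  # range of dynamic characteristics
        return np.concatenate(([length, avg_angle, avg_speed], dyn_range))

    def potential_score(x, templates, sigma=1.0):
        """Potential-function similarity of a feature vector to enrolled template vectors
        (a Cauchy-type kernel; one common choice, assumed here for illustration)."""
        d2 = np.sum((templates - x) ** 2, axis=1)
        return float(np.mean(1.0 / (1.0 + d2 / sigma ** 2)))
    ```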

  • Control algorithm for a mechatronic station for sorting products using a computer vision system

    The paper considers the use of a computer vision system for product quality inspection within the control algorithm of a mechatronic sorting station, using footwear products as an example. The developed system is based on machine learning methods for image recognition by segmentation. As a result, a neural network model was created and a program was written that identifies and selects objects from a camera feed for subsequent sorting of defective products. The program contains three modules: an initialization module that declares all variables, models, classes, and the video stream from the camera; the main module, which contains an inner loop over each segmented object; and a subroutine that completes the work. Introducing computer vision into the control algorithm increases the efficiency and flexibility of the quality control system and improves the accuracy of measuring object parameters for subsequent sorting (a schematic outline of the program structure is given after the keywords).

    Keywords: mechatronic station, sorting, computer vision, image segmentation, neural network training, control algorithm
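
    The three-module structure described above (initialization; a main loop with an inner pass over each segmented object; shutdown) can be outlined as follows. This is a schematic Python/OpenCV sketch: the contour-based segment() function is only a stand-in for the trained neural segmentation model, and the defect rule is a toy placeholder.

    ```python
    import cv2

    def init():
        """Initialization module: camera stream and class labels (placeholders for the real setup)."""
        cap = cv2.VideoCapture(0)            # video stream from the camera
        classes = ["good", "defect"]
        return cap, classes

    def segment(frame):
        """Stand-in for the neural segmentation model: contours of dark regions."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return contours

    def main_loop(cap, classes):
        """Main module: inner loop over each segmented object in the current frame."""
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            for cnt in segment(frame):
                x, y, w, h = cv2.boundingRect(cnt)
                label = classes[1] if cv2.contourArea(cnt) > 500 else classes[0]  # toy decision rule
                color = (0, 0, 255) if label == "defect" else (0, 255, 0)
                cv2.rectangle(frame, (x, y), (x + w, y + h), color, 2)
            cv2.imshow("sorting", frame)
            if cv2.waitKey(1) == 27:         # Esc ends the session
                break

    def shutdown(cap):
        """Termination subroutine: release the camera and close windows."""
        cap.release()
        cv2.destroyAllWindows()

    if __name__ == "__main__":
        cap, classes = init()
        main_loop(cap, classes)
        shutdown(cap)
    ```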

  • Development of Software for Calculating Formation Parameters of Functional Coatings with Specified Adhesion (Case Study: Polyisobutylene-Based Coatings)

    This paper presents the results of an investigation into the adhesion properties of release coatings based on polyisobutylene applied to metallic substrates. A software tool was developed in Microsoft Visual Studio using the C++ programming language to compute the composition and effective technological parameters for forming coatings that ensure optimal adhesion to protected surfaces. As a case study, the method of calculating the relationships between composition, temperature, and formation time is demonstrated for coatings achieving the highest adhesion, corresponding to a score of “zero” on the standardized six-point cross-cut adhesion test. It is shown that the application of the developed software enables parameter evaluation within 1–2 seconds. The computational results are experimentally validated. The morphology of the coatings was examined using optical microscopy. It was observed that no delamination occurs at the intersection points of cuts or within the grid pattern.

    Keywords: coating, adhesion, microstructure, cross-cut test, polyisobutylene, optimization

  • The effect of data replacement and expansion using transformations on the recognition accuracy of the deep neural network ResNet-50

    The article examines how replacing the original data with transformed data affects the quality of training of deep neural network models. The author conducts four experiments to assess the impact of data substitution in tasks with small datasets. In the first experiment, the model is trained without any changes to the original dataset; in the second, all images in the original set are replaced with transformed ones; in the third, the number of original images is reduced and the dataset is expanded with transformed copies of the images; and in the fourth, the dataset is expanded so as to balance the number of images in each class (the four configurations are sketched after the keywords).

    Keywords: dataset, extension, neural network models, classification, image transformation, data replacement
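
    The four dataset configurations can be reproduced, for example, with torchvision; the specific transformations, dataset path, and subset/oversampling choices below are illustrative assumptions, not those of the article.

    ```python
    import torch
    from torchvision import datasets, transforms

    base = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
    augment = transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.RandomHorizontalFlip(p=1.0),      # example transformation
        transforms.ColorJitter(brightness=0.3),
        transforms.ToTensor(),
    ])

    root = "data/train"                              # hypothetical dataset path
    original = datasets.ImageFolder(root, transform=base)     # experiment 1: unchanged data
    replaced = datasets.ImageFolder(root, transform=augment)  # experiment 2: all images replaced by transformed ones

    # experiment 3: a reduced subset of originals plus its transformed copies
    subset_idx = list(range(0, len(original), 2))              # keep every second image
    reduced = torch.utils.data.Subset(original, subset_idx)
    expanded = torch.utils.data.ConcatDataset(
        [reduced, torch.utils.data.Subset(replaced, subset_idx)]
    )

    # experiment 4: expansion to balance class sizes (sketch: oversample rarer classes)
    targets = torch.tensor([s[1] for s in original.samples])
    counts = torch.bincount(targets)
    weights = (1.0 / counts.float())[targets]                  # larger weight for rarer classes
    sampler = torch.utils.data.WeightedRandomSampler(weights, num_samples=len(original))
    loader = torch.utils.data.DataLoader(original, batch_size=32, sampler=sampler)
    ```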

  • Development of a power supply system for an organic substrate of an integrated circuit crystal with a high-speed interface at a rate of 28.25 Gbps

    The article focuses on methods for reducing high inductance in power supply circuits, using one of the IC substrate topologies with a high-speed interface as an example. The interface in question operates at 28.25 Gbit/s and imposes strict requirements on the power supply inductance. The presented solutions are aimed at ensuring low power supply inductance under the high layout density and power integrity requirements of modern data transfer interfaces.

    Keywords: power supply inductor, power system, low noise power supply, power supply impedance, analog power supply, serial interface, high speed interface, organic substrate, IC packaging

  • Calculation of the area of the image of a flat region using mathematical analysis methods

    The paper proposes a method for calculating the area of a flat region from a photograph using methods of mathematical analysis. The area is computed as a curvilinear integral of the second kind along the closed contour bounding the region under consideration. Defining the boundary as a Bezier spline reduces the curvilinear integral to several definite integrals of the Bernstein basis polynomials, for which an explicit form is obtained. For a third-order Bezier spline, a formula is derived that expresses the area of the region in terms of the coordinates of the control points of the Bezier curves (a numerical sketch of the underlying Green's-formula computation follows the keywords).

    Keywords: cubic spline, Bernstein basis polynomials, Bezier curve, Bezier spline, Green's formula, beta function, gamma function
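
    By Green's formula, the area equals one half of the line integral of (x dy − y dx) along the boundary. The sketch below evaluates that integral numerically for a closed cubic Bezier spline; the article instead derives explicit values of the Bernstein-polynomial integrals, which this sketch does not reproduce.

    ```python
    import numpy as np

    def bezier_point_and_deriv(p, t):
        """Cubic Bezier curve B(t) and derivative B'(t) for control points p (4x2), t in [0, 1]."""
        b = np.array([(1 - t) ** 3, 3 * t * (1 - t) ** 2, 3 * t ** 2 * (1 - t), t ** 3])
        db = np.array([-3 * (1 - t) ** 2, 3 * (1 - t) ** 2 - 6 * t * (1 - t),
                       6 * t * (1 - t) - 3 * t ** 2, 3 * t ** 2])
        return b @ p, db @ p

    def spline_area(segments, n=200):
        """Area enclosed by a closed cubic Bezier spline via Green's formula:
        A = 1/2 * closed-contour integral of (x dy - y dx), trapezoid rule per segment."""
        t = np.linspace(0.0, 1.0, n)
        area = 0.0
        for p in segments:                      # each segment: 4 control points, shape (4, 2)
            vals = [bezier_point_and_deriv(np.asarray(p, float), ti) for ti in t]
            pts = np.array([v[0] for v in vals])
            der = np.array([v[1] for v in vals])
            f = pts[:, 0] * der[:, 1] - pts[:, 1] * der[:, 0]
            area += 0.5 * np.sum((f[1:] + f[:-1]) / 2) * (t[1] - t[0])
        return abs(area)

    # toy check: a unit square built from four degenerate (straight) cubic segments
    sq = [[(0, 0), (1/3, 0), (2/3, 0), (1, 0)], [(1, 0), (1, 1/3), (1, 2/3), (1, 1)],
          [(1, 1), (2/3, 1), (1/3, 1), (0, 1)], [(0, 1), (0, 2/3), (0, 1/3), (0, 0)]]
    print(spline_area(sq))   # approximately 1.0
    ```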

  • Adaptive convolutional neural network for detecting safety violations

    The article presents an adaptive convolutional neural network for automated real-time detection of safety violations. Unlike existing solutions based on static models, the proposed approach includes two key innovations. The first is automatic adaptation of the model weights using a combination of stochastic and gradient descent methods: the algorithm dynamically adjusts the learning rate and the depth of parameter modification, which makes it possible to preserve previously acquired knowledge while continuing to train on new data without degrading accuracy. The second is an optimized context-processing mechanism: the model analyzes not only objects (for example, the absence of a helmet) but also their relative location (a worker in a hazardous area without personal protective equipment), which reduces the number of false alarms. The developed system integrates computer vision, alert generation, and analytics modules, providing not only an immediate response to violations but also long-term risk analysis. Experiments confirmed a 15% increase in accuracy under changing lighting conditions and shooting angles.

    Keywords: convolutional neural network, information system, industrial accidents, safety, production, model training, neural network, adaptive algorithm

  • Modeling Paid Parking Occupancy: A Regression Analysis Taking into Account Customer Behavior

    The article describes a methodology for constructing a regression model of the occupancy of paid parking zones that takes into account the uneven distribution of sessions during the day and the behavioral characteristics of two groups of clients; the regression model consists of two equations, one reflecting the characteristics of each group. The paper also describes the construction of the data model; the collection, processing, and analysis of the data; and the distribution of occupancy over the day. A methodology is given for modeling a phenomenon whose distribution is bell-shaped and depends on the time of day (a minimal sketch of such a fit is given after the keywords). The results can be used by commercial enterprises managing parking lots, by city administrations, and by researchers modeling similar indicators that exhibit the normal distribution characteristic of many natural processes (customer flow in bank branches, replenishment and withdrawal of funds over the life of replenishable deposits, etc.).

    Keywords: paid parking, occupancy, regression model, customer behavior, behavioral segmentation, model robustness, model, forecast, parking management, distribution
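
    The two-equation model itself is not given in the abstract; as a minimal illustration of the bell-shaped daily profile it describes, the sketch below fits a Gaussian occupancy curve separately for two hypothetical client groups on synthetic data and sums them. All numbers are illustrative.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def bell(hour, amplitude, peak_hour, width, base):
        """Bell-shaped (Gaussian) dependence of occupancy on the hour of the day."""
        return base + amplitude * np.exp(-((hour - peak_hour) ** 2) / (2 * width ** 2))

    hours = np.arange(24, dtype=float)
    rng = np.random.default_rng(0)
    # synthetic sessions: short-term visitors peak around midday, long-term clients in the morning
    occ_short = bell(hours, 60, 13, 3, 5) + rng.normal(0, 3, 24)
    occ_long = bell(hours, 40, 9, 2, 10) + rng.normal(0, 3, 24)

    params_short, _ = curve_fit(bell, hours, occ_short, p0=[50, 12, 3, 5])
    params_long, _ = curve_fit(bell, hours, occ_long, p0=[30, 9, 2, 10])

    # the occupancy model is the sum of the two group equations
    total = bell(hours, *params_short) + bell(hours, *params_long)
    print(np.round(params_short, 2), np.round(params_long, 2))
    ```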

  • Software for calculating the surface characteristics of liquid media

    Software has been developed in the Microsoft Visual Studio environment to evaluate the surface characteristics of liquids, solutions, and suspensions. The module has a user-friendly interface, requires no special skills from the user, and performs a numerical calculation of the energy characteristics of a liquid in about one second: the work of adhesion and cohesion, the wetting energy, the spreading coefficient, and the adhesion of the liquid composition to the contact surface. Using distilled water as a test liquid and an initial release lubricant of the Penta-100 series, the calculation of the wetting of a steel surface by liquid media is demonstrated (the underlying textbook relations are sketched after the keywords). Optical microscopy has shown that good wetting of the steel surface by the lubricant ensures the formation of a homogeneous, defect-free coating. The proposed module allows an express assessment of the compatibility of liquid formulations with the protected surface and is of interest to manufacturers of paint and varnish materials for product quality control.

    Keywords: computer program, C# programming language, wetting, surface, adhesion
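
    The quantities listed (work of adhesion and cohesion, wetting energy, spreading coefficient) have standard textbook expressions in terms of the liquid's surface tension and the contact angle on the solid. The program itself is written in C# (per the keywords); the sketch below is a language-neutral Python illustration with assumed input values.

    ```python
    import math

    def surface_characteristics(sigma_lv, theta_deg):
        """Standard wetting energetics from the liquid surface tension sigma_lv (mJ/m^2)
        and the contact angle theta on the solid surface (degrees)."""
        cos_t = math.cos(math.radians(theta_deg))
        work_adhesion = sigma_lv * (1 + cos_t)      # Young-Dupre equation
        work_cohesion = 2 * sigma_lv
        wetting_energy = sigma_lv * cos_t           # adhesion tension
        spreading = work_adhesion - work_cohesion   # spreading coefficient
        return work_adhesion, work_cohesion, wetting_energy, spreading

    # distilled water on steel (illustrative values: sigma = 72.8 mJ/m^2, theta = 70 degrees)
    print(surface_characteristics(72.8, 70.0))
    ```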

  • Modeling the interaction of a single abrasive grain with the surface of a part

    A review of various approaches used to model the contact interaction between the grinding wheel grain and the surface layer of the workpiece during grinding is presented. In addition, the influence of material properties, grinding parameters and grain morphology on the contact process is studied.

    Keywords: grinding, grain, contact zone, modeling, grinding wheel, indenter, micro cutting, cutting depth

  • Methods for forming quasi-orthogonal matrices based on pseudo-random sequences of maximum length

    Linear feedback shift registers (LFSRs) and the pseudo-random sequences of maximum length (m-sequences) they generate are widely used in mathematical modeling, cryptography, radar, and communications. Their wide use is due to their special properties, in particular their correlation properties. An interesting property of these sequences, rarely discussed in the recent scientific literature, is the possibility of forming quasi-orthogonal matrices on their basis. In this paper, methods for generating quasi-orthogonal matrices from pseudo-random sequences of maximum length (m-sequences) are studied. The existing method, based on a cyclic shift of the m-sequence and the addition of a border to the resulting cyclic matrix, is analyzed. An alternative method is proposed, based on the relationship between m-sequences and quasi-orthogonal Mersenne and Hadamard matrices, which makes it possible to generate cyclic quasi-orthogonal matrices of symmetric structure without a border (a minimal sketch of the cyclic construction follows the keywords). A comparative analysis of the correlation properties of the matrices obtained by both methods and of the original m-sequences is performed. It is shown that the proposed method inherits the correlation properties of m-sequences, provides more efficient storage, and is potentially better suited to privacy problems.

    Keywords: orthogonal matrices, quasi-orthogonal matrices, Hadamard matrices, m-sequences
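
    A minimal sketch of the cyclic construction: generate an m-sequence with a Fibonacci LFSR (here for the primitive polynomial x^4 + x + 1, period 15), map it to ±1, and take its cyclic shifts as matrix rows; the off-diagonal row correlations then equal −1, the near-orthogonality inherited from the m-sequence. The bordering step of the existing method and the paper's Mersenne/Hadamard construction are not reproduced here.

    ```python
    import numpy as np

    def lfsr_msequence(taps, n):
        """Maximum-length sequence from an n-bit Fibonacci LFSR.
        taps: 1-based register positions whose XOR feeds back (primitive polynomial assumed)."""
        state = [1] * n                       # any non-zero seed
        seq = []
        for _ in range(2 ** n - 1):
            seq.append(state[-1])             # output bit
            fb = 0
            for t in taps:
                fb ^= state[t - 1]
            state = [fb] + state[:-1]         # shift, insert feedback
        return np.array(seq)

    def circulant_from_sequence(seq):
        """Cyclic matrix whose rows are cyclic shifts of the +/-1 mapped m-sequence."""
        s = 1 - 2 * seq                       # map 0 -> +1, 1 -> -1
        return np.array([np.roll(s, k) for k in range(len(s))])

    m = lfsr_msequence(taps=[4, 1], n=4)      # polynomial x^4 + x + 1, period 15
    M = circulant_from_sequence(m)
    print(M @ M.T)                            # diagonal 15, all off-diagonal entries -1
    ```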

  • Moving from a university data warehouse to a data lake: models and methods of big data processing

    The article examines the transition of universities from data warehouses to data lakes, revealing their potential in processing big data. The introduction highlights the main differences between warehouses and lakes, focusing on the difference in data management philosophy. Data warehouses are typically used for structured data with a relational architecture, while data lakes store data in its raw form, supporting flexibility and scalability. The section "Data Sources Used by the University" describes how universities manage data collected from various departments, including ERP systems and cloud databases. The discussion of data lakes and data warehouses highlights their key differences in data processing and management methods, as well as their advantages and disadvantages. The article examines in detail the problems and challenges of the transition to data lakes, including security, scale, and implementation costs. Architectural models of data lakes such as the "Raw Data Lake" and the "Data Lakehouse" are presented, describing different approaches to managing the data lifecycle and business goals. Big data processing methods in lakes cover the use of the Apache Hadoop platform and current storage formats. Processing technologies are described, including the use of Apache Spark and machine learning tools, and practical examples of data processing and machine learning coordinated through Spark are proposed (an illustrative fragment is given after the keywords). In conclusion, the relevance of the transition to data lakes for universities is emphasized, security and management challenges are highlighted, and the use of cloud technologies is recommended to reduce costs and increase productivity in data management.

    Keywords: data warehouse, data lake, big data, cloud storage, unstructured data, semi-structured data
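
    An illustrative PySpark fragment of the kind of processing described (reading raw semi-structured files from the lake, writing a curated columnar table, and a simple MLlib step coordinated through Spark); the paths, schema, and model are hypothetical.

    ```python
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.regression import LinearRegression

    spark = SparkSession.builder.appName("university-data-lake").getOrCreate()

    # raw zone of the lake: semi-structured JSON exported from an ERP system (hypothetical path and columns)
    raw = spark.read.json("s3a://university-lake/raw/erp/grades/*.json")

    # curated zone: cleaned, typed table stored in a columnar format
    curated = (raw
               .dropna(subset=["student_id", "attendance", "avg_grade"])
               .withColumn("attendance", F.col("attendance").cast("double"))
               .withColumn("avg_grade", F.col("avg_grade").cast("double")))
    curated.write.mode("overwrite").parquet("s3a://university-lake/curated/grades")

    # simple ML step coordinated through Spark: predict average grade from attendance
    features = VectorAssembler(inputCols=["attendance"], outputCol="features").transform(curated)
    model = LinearRegression(featuresCol="features", labelCol="avg_grade").fit(features)
    print(model.coefficients, model.intercept)
    ```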

  • Determination of zigzag nature of vehicle trajectories

    The paper presents a method for the quantitative assessment of zigzag vehicle trajectories, which makes it possible to identify potentially dangerous driver behavior. The algorithm analyzes changes in direction between trajectory segments and includes data preprocessing steps: merging of closely spaced points and trajectory simplification using a modified Ramer-Douglas-Peucker algorithm (a minimal sketch of this pipeline follows the keywords). Experiments on a balanced dataset (20 trajectories) confirmed the effectiveness of the method: accuracy 0.8, recall 1.0, F1-measure 0.833. The developed approach can be applied in traffic monitoring, accident prevention, and hazardous driving detection systems. Further research is aimed at improving accuracy and adapting the method to real-world conditions.

    Keywords: trajectory, trajectory analysis, zigzag, trajectory simplification, Ramer-Douglas-Peucker algorithm, YOLO, object detection
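
    A minimal sketch of the described pipeline: merge closely spaced points, simplify the trajectory with the Ramer-Douglas-Peucker algorithm, then score the share of sharp direction changes between successive segments. The distance and angle thresholds are illustrative, and the paper's modification of the algorithm is not reproduced.

    ```python
    import numpy as np

    def merge_close_points(points, min_dist=1.0):
        """Preprocessing: drop points closer than min_dist to the previously kept point."""
        kept = [points[0]]
        for p in points[1:]:
            if np.linalg.norm(p - kept[-1]) >= min_dist:
                kept.append(p)
        return np.array(kept)

    def rdp(points, eps=2.0):
        """Ramer-Douglas-Peucker trajectory simplification."""
        if len(points) < 3:
            return points
        start, end = points[0], points[-1]
        line = end - start
        norm = np.linalg.norm(line)
        if norm > 0:   # perpendicular distance of each point to the start-end chord
            d = np.abs(line[0] * (points[:, 1] - start[1]) - line[1] * (points[:, 0] - start[0])) / norm
        else:
            d = np.linalg.norm(points - start, axis=1)
        i = int(np.argmax(d))
        if d[i] > eps:
            return np.vstack([rdp(points[:i + 1], eps)[:-1], rdp(points[i:], eps)])
        return np.array([start, end])

    def zigzag_score(points, angle_thresh_deg=45.0):
        """Share of vertices where the direction change exceeds the threshold."""
        seg = np.diff(points, axis=0)
        ang = np.unwrap(np.arctan2(seg[:, 1], seg[:, 0]))
        turns = np.degrees(np.abs(np.diff(ang)))
        return float(np.mean(turns > angle_thresh_deg)) if len(turns) else 0.0

    traj = merge_close_points(np.array([[0, 0], [5, 4], [10, -3], [15, 5], [20, -4], [25, 3]], float))
    print(zigzag_score(rdp(traj)))   # close to 1.0 for this strongly zigzagging example
    ```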

  • Tree-based diagnostic classifiers of hump retarders

    The paper addresses the problem of technical diagnostics of hump yard control devices, such as wagon retarders. Current analytical methods of monitoring and diagnosing the condition of wagon retarders are reviewed. The factors used in existing diagnostic systems are analyzed, and new factors to take into account are suggested, including specific track peculiarities, wagon group lengths, braking curve styles, initial wagon group speed, and environmental conditions. The suggested set of factors is characterized from the standpoint of regression analysis, and the replacement of some continuous factors with lexical ones is suggested. Decision-tree classifiers are proposed for classifying hump retarder conditions; such classifiers can be built by data mining on a training set (a minimal sketch on synthetic data follows the keywords). An improved method of building decision trees is suggested, and its advantage over existing algorithms is shown on evaluation sets.

    Keywords: hump yard, wagon retarders, regression, decision trees, classification, data mining, multi-factor analysis, soft computing
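
    A minimal scikit-learn sketch using a factor set of the kind suggested in the paper, on synthetic data with a toy condition label. The paper's improved tree-building method is not reproduced; an ordinary CART tree stands in for it.

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    rng = np.random.default_rng(1)
    n = 400
    # synthetic training set with the kinds of factors discussed in the paper
    X = pd.DataFrame({
        "track_id": rng.integers(0, 5, n),           # lexical factor: specific track
        "group_length_m": rng.uniform(10, 120, n),
        "braking_style": rng.integers(0, 3, n),      # lexical factor: braking curve style
        "initial_speed_kmh": rng.uniform(5, 25, n),
        "air_temp_c": rng.uniform(-30, 35, n),
    })
    # synthetic condition label: 0 = normal, 1 = degraded (toy rule plus noise)
    y = ((X["initial_speed_kmh"] - 0.1 * X["group_length_m"] + 0.2 * X["air_temp_c"]
          + rng.normal(0, 2, n)) > 12).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = DecisionTreeClassifier(max_depth=4, min_samples_leaf=10).fit(X_tr, y_tr)
    print("accuracy on the evaluation set:", accuracy_score(y_te, clf.predict(X_te)))
    ```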

  • Queuing system with mutual assistance between channels and limited dwell time

    In this paper, a new model of an open multichannel queuing system with mutual assistance between channels and limited waiting time for a request in a queue is proposed. General mathematical dependencies for the probabilistic characteristics of such a system are presented.

    Keywords: queuing system, queue, service device, mutual assistance between channels

  • On the Development of Secure Applications Based on the Integration of the Rust Programming Language and PostgreSQL DBMS

    Currently, key aspects of software development include the security and efficiency of the applications being created, with special attention given to data security and operations involving databases. This article discusses methods and techniques for developing secure applications through the integration of the Rust programming language and the PostgreSQL database management system (DBMS). Rust is a general-purpose programming language that prioritizes safety as its primary objective. The article examines key concepts of Rust, such as strict typing, the RAII (Resource Acquisition Is Initialization) idiom, macro definitions, and immutability, and how these features contribute to the development of reliable and high-performance applications when interfacing with databases. The integration with PostgreSQL, shown to be both straightforward and robust, is analyzed, highlighting its capacity for efficient data management while maintaining a high level of security and thereby mitigating common errors and vulnerabilities. Rust is currently used less widely than popular languages such as JavaScript, Python, and Java, in part because of its steep learning curve; however, major companies see its potential. Rust modules are being integrated into operating system kernels (Linux, Windows, Android), Mozilla is developing features for Firefox's Gecko engine in Rust, and Stack Overflow surveys show rising usage of the language. A practical example involving the dispatch of information related to class schedules and video content illustrates the advantages of using Rust together with PostgreSQL to create a scheduling management system that ensures data integrity and security.

    Keywords: Rust programming language, memory safety, RAII, metaprogramming, DBMS, PostgreSQL

  • Development of an integrated computer vision and gait control system for an insect-like robot

    This work is devoted to describing the development and integration of two key subsystems of an insect-like six-legged robot: a gait control module and a computer vision system for autonomous navigation. It examines architectural solutions, algorithmic foundations, and the practical implementation of components that ensure stable movement and intelligent interaction of the robot with its surroundings.

    Keywords: insect-like robot, gait control module, computer vision, autonomous navigation, ROS2, SLAM, RTABMap, NAV2, OctoMap, tripod gait, Raspberry Pi, LiDAR

  • Modeling of the drive of the solar battery orientation system under wind conditions

    The article considers a variant of constructing a model of a solar battery orientation drive based on a DC motor and PID control. Orientation in space is performed along two axes: azimuth and zenith. The model is used for optimal tuning of the PID controller parameters when tracking the required orientation angles under gusty wind conditions. The main tuning criteria are: small overshoot when tracking the angle, an aperiodic (non-oscillatory) character of the transient processes, minimum dynamic error in compensating for wind effects, and minimum settling time in response to the disturbance. The controller was optimized using the coordinate descent method (a condensed sketch of this tuning loop follows the keywords). A variant of controller tuning for the optimal mode is given, with process graphs confirming its practical optimality. The constructed drive model can be used to implement a digital twin of the monitoring and control system for the solar panel orientation drive.

    Keywords: mathematical model of the drive, PID controller, solar panel, gusty wind effects, azimuth and zenith orientation, optimization by complex criterion
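
    A condensed sketch of the tuning idea: simulate a simple one-axis drive model with a PID controller under a wind-gust disturbance and tune the gains by coordinate descent over a composite criterion (here an ITAE term plus an overshoot penalty). The plant parameters, gust profile, and criterion weights are illustrative assumptions, not the article's.

    ```python
    import numpy as np

    def simulate(gains, t_end=10.0, dt=0.01, ref=0.5):
        """Toy azimuth-axis drive (inertia plus friction) under a wind gust, PID control."""
        kp, ki, kd = gains
        J, b, km = 0.5, 0.2, 1.0                    # illustrative inertia, friction, motor constant
        theta = omega = integ = 0.0
        prev_err = ref
        cost = 0.0
        for k in range(int(t_end / dt)):
            t = k * dt
            gust = 0.4 if 4.0 < t < 4.5 else 0.0    # gusty wind torque
            err = ref - theta
            integ += err * dt
            deriv = (err - prev_err) / dt
            u = kp * err + ki * integ + kd * deriv
            prev_err = err
            omega += dt * (km * u - b * omega + gust) / J
            theta += dt * omega
            cost += t * abs(err) * dt                   # ITAE part of the composite criterion
            cost += 50.0 * max(0.0, theta - ref) * dt   # penalty for overshoot
        return cost

    def coordinate_descent(gains, step=0.5, iters=30):
        """Tune kp, ki, kd one coordinate at a time, halving the step when no move helps."""
        gains = list(gains)
        best = simulate(gains)
        for _ in range(iters):
            improved = False
            for i in range(3):
                for d in (+step, -step):
                    trial = gains.copy()
                    trial[i] = max(0.0, trial[i] + d)
                    c = simulate(trial)
                    if c < best:
                        best, gains, improved = c, trial, True
            if not improved:
                step *= 0.5
        return gains, best

    print(coordinate_descent([1.0, 0.1, 0.1]))
    ```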

  • Features of the implementation of an intelligent model for recognizing apple tree diseases by leaves

    This article discusses the implementation features of a model for recognizing apple tree diseases from leaves. In the course of the work, a number of experiments were conducted with well-known convolutional network architectures (ResNet50, VGG16, and MobileNet); accuracy was found to decrease in that order, from ResNet50 to MobileNet. The effect of network parameters on accuracy is considered: the number of layers and batch normalization. The practical experience of building our own model is described; its parameters, such as the number of hidden layers, were also varied, and the network shows the best result with 3 and 4 layers (a sketch of such a configurable model is given after the keywords). The dependence of training on the number of layers and the range of epochs is studied: in early epochs, curves with a small number of layers show a rapid increase in accuracy, while curves with a large number of layers train more slowly, which is probably due to the vanishing-gradient effect. In particular, it is shown that batch normalization or deepening of the network can have a positive effect on this vanishing-gradient behavior, and the use of batch normalization gave a pronounced improvement of the network in the range from 35 to 50 epochs. For clarity, the work includes graphs of neural network training. In the course of the work, conclusions and recommendations were formed on how to build a network more effectively.

    Keywords: artificial intelligence, computer vision, neural networks, deep learning, machine learning
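
    A minimal PyTorch sketch of the kind of configurable model discussed (a variable number of convolutional blocks, with batch normalization switched on or off); the dataset and the article's exact architecture are not reproduced.

    ```python
    import torch
    import torch.nn as nn

    class LeafNet(nn.Module):
        """Small CNN with a configurable number of convolutional blocks and optional batch norm."""
        def __init__(self, num_layers=3, num_classes=4, use_batchnorm=True):
            super().__init__()
            blocks, ch_in = [], 3
            for i in range(num_layers):
                ch_out = 32 * 2 ** min(i, 2)
                blocks.append(nn.Conv2d(ch_in, ch_out, kernel_size=3, padding=1))
                if use_batchnorm:
                    blocks.append(nn.BatchNorm2d(ch_out))
                blocks += [nn.ReLU(), nn.MaxPool2d(2)]
                ch_in = ch_out
            self.features = nn.Sequential(*blocks)
            self.head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(ch_in, num_classes))

        def forward(self, x):
            return self.head(self.features(x))

    # compare 3- and 4-layer variants, with and without batch normalization
    for layers in (3, 4):
        for bn in (False, True):
            model = LeafNet(num_layers=layers, use_batchnorm=bn)
            out = model(torch.randn(2, 3, 128, 128))
            print(layers, bn, out.shape)      # torch.Size([2, 4])
    ```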

  • Analysis of existing structural solutions for in-tube inspection robots: selection of the optimal type of movement and chassis for 3D scanning of the weld relief in large diameter welded straight-seam pipes using a laser triangulation sensor

    This article provides an overview of existing structural solutions for in-line robots designed for inspection work. The main attention is paid to the analysis of various motion mechanisms and chassis types used in such robots, as well as to the identification of their advantages and disadvantages in relation to the task of scanning a longitudinal weld. Such types of robots as tracked, wheeled, helical and those that move under the influence of pressure inside the pipe are considered. Special attention is paid to the problem of ensuring stable and accurate movement of the robot along the weld, minimizing lateral displacements and choosing the optimal positioning system. Based on the analysis, recommendations are offered for choosing the most appropriate type of motion and chassis to perform the task of constructing a 3D model of a weld using a laser triangulation sensor (hereinafter referred to as LTD).

    Keywords: in-line work, inspection work, 3D scanning, welds, structural solutions, types of movement, chassis, crawler robots, wheeled robots, screw robots, longitudinal welds, laser triangulation sensor

  • Analysis of the directions of application of predictive analytics in railway transport

    The railway transport industry demonstrates significant achievements in various fields of activity through the introduction of predictive analytics. Predictive analytics systems use data from a variety of sources, such as sensor networks, historical data, weather conditions, etc. The article discusses the key areas of application of predictive analytics in railway transport, as well as the advantages, challenges and prospects for further development of this technology in the railway infrastructure.

    Keywords: predictive analytics in railway transport, passenger traffic forecasting, freight optimization, maintenance optimization, inventory and supply management, personnel management, financial planning, big data analysis

  • Simulation modeling of transient response calculation using the Duhamel integral

    A Simulink model is considered that makes it possible to calculate transient processes of objects described by a step response, for any type of input action. The operation algorithm of the S-function that performs the calculation using the Duhamel integral is described. Owing to the features of the S-function, it can store values from the previous step of the Simulink model calculation. This allows the input signal to be decomposed into step components, storing the time of occurrence and value of each step. For each increment of the input signal, the S-function calculates the response by scaling the step response, and at each calculation step the sum of these reactions is found (a short numerical illustration of this superposition is given after the keywords). The S-function provides a procedure for freeing memory once the end point of the step response is reached for a given step, so the amount of memory required for the calculation does not grow beyond a certain limit and, in general, does not depend on the length of the model time. The S-function uses matrix operations rather than loops, so the model computes quite quickly. The article presents the results of calculations, gives recommendations for setting the model parameters, and formulates a conclusion on the possibility of using the model for calculating dynamic modes.

    Keywords: simulation modeling, Simulink, step response, step function, S-function, Duhamel integral
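
    The superposition described (decomposing the input into step components and summing scaled, shifted copies of the step response) is easy to illustrate outside Simulink; below is a short Python sketch in which an arbitrary first-order step response stands in for the object's transient function.

    ```python
    import numpy as np

    def step_response(t, T=1.0):
        """Example step response of a first-order plant with time constant T."""
        return np.where(t >= 0.0, 1.0 - np.exp(-np.clip(t, 0.0, None) / T), 0.0)

    def duhamel(t, u, h_step):
        """Superposition by the Duhamel integral: the input is decomposed into step
        increments; each produces a scaled, time-shifted copy of the step response."""
        du = np.diff(u, prepend=0.0)              # step components of the input signal
        y = np.zeros_like(t)
        for k, duk in enumerate(du):
            if duk != 0.0:                        # store only the steps that actually occurred
                y += duk * h_step(t - t[k])
        return y

    t = np.linspace(0.0, 10.0, 1001)
    u = np.where(t < 2.0, 0.0, 1.0) + np.where(t > 5.0, 0.5, 0.0)   # piecewise-constant test input
    y = duhamel(t, u, step_response)
    print(y[-1])   # approaches 1.5, the final value of the input
    ```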

  • Computer Modeling and Improvement of Extrusion Tooling 

    This paper presents a simulation of square bar extrusion using the finite element method in QForm software. The sequence of simulation stages is described and verified using an example of extruding AD31 alloy. The geometry of the extrusion tool (die) was improved to reduce material damage and eliminate profile distortion. It was found that rounding the corners and adding a calibrating section to the die provides improved performance. The analysis of metal heating from plastic deformation is also provided. The study highlights the importance of computer modeling for optimizing tools and increasing the efficiency of the metal forming process. Specifically, the initial and final die geometries were compared to assess the impact on product properties. Recommendations are provided for using simulation in the practice of developing metal forming process technologies. The benefits of using QForm software over alternative solutions are highlighted, including access to a comprehensive material database and robust technical support. The improved die design resulted in reduced plastic deformation and improved product quality. Finite element analysis significantly accelerates the development and testing of tools, providing the ability to develop optimal designs that account for various factors.

    Keywords: extrusion, die design, finite element method, computer modeling, optimization, metal forming, aluminum alloy, process simulation

  • Prospects for the Application of Vibrational Combustion of Solid Fuel in a Dual-Circuit Closed-Loop Installation to Improve the Efficiency of Agricultural Services

    This article examines the use of a dual-circuit closed-loop system for pulsation combustion of solid fuel in agricultural services. The results of research on gas self-oscillations during wood waste combustion in a Helmholtz resonator-type installation are presented. It has been established that with certain geometric characteristics of the installation, a significant sound pressure level (up to 165 dB) can be achieved in the combustion chamber while maintaining a low noise level (up to 65 dB), which contributes to increased combustion efficiency and meets environmental requirements. Potential applications of this technology are proposed, including agricultural waste utilization, drying of agricultural products, and heating of greenhouse complexes and livestock facilities.

    Keywords: Pulsation combustion, Helmholtz resonator, solid fuel, agricultural waste, energy efficiency, biomass utilization, agro-industrial complex.

  • Numerical study of the effect of dust deposits on heat transfer in porous heat exchangers

    A study of heat transfer in porous heat exchangers with deposits of dust particles was carried out. The influence of the heat exchanger length and of the presence of a deposit on the Nusselt number was studied. It was found that increasing the length of the heat exchanger from 5 to 30 mm increases the Nusselt number by 39.72-81.35%, depending on the Reynolds number, while the formation of a deposit on the heat exchanger surface reduces the Nusselt number by 2.8-6.6%.

    Keywords: heat transfer, hydrodynamics, calculation, Nusselt number, Reynolds number, mathematical modeling, sediment formation, microelectronics, cooling systems, radiator