Disclosure of Invention
The invention provides a multithreading-based streaming media task processing method and a corresponding device, which can achieve dynamic and differentiated streaming media task assignment, support rapid adjustment of the multithreading-based streaming media task processing strategy so that tasks with different loads are dispatched to different threads, and solve the problems that the prior art cannot achieve differentiated configuration and has poor scalability.
The invention also provides a computer device and a readable storage medium for the multithreading-based streaming media task processing method.
In order to solve the problems, the invention adopts the following technical scheme:
in a first aspect, the present invention provides a method for processing a multithread-based streaming media task, where the method includes:
receiving a task execution request of a streaming media task, wherein the task execution request comprises a target task to be executed;
extracting task information in the target task, creating a thread pool by utilizing a pre-created thread according to the task information, and setting corresponding environment variables for each thread in the thread pool;
and dispatching the target task to a thread matched with the target task in the thread pool according to a preset dynamic balance scheduling strategy, and executing the target task by the thread under the environment variable of the thread.
Specifically, the creating a thread pool by using a pre-created thread according to the task information, and setting a corresponding environment variable for each thread in the thread pool includes:
determining the maximum number of threads according to the number of CPU cores and creating a corresponding number of threads;
creating the thread pool by using a ThreadPoolExecutor class according to the task information, wherein the parameters of the constructor of the ThreadPoolExecutor class at least comprise: the number of core threads, the maximum number of threads, and a task queue for storing the tasks to be executed.
Specifically, the dispatching the target task to the thread matched with the target task in the thread pool according to a preset dynamic balance scheduling policy includes:
counting, according to a preset task statistics function, the mapping relation between each current thread and the task it is currently executing, and recording the number of tasks currently executed by each thread;
and distributing the target task to the thread with the fewest currently executing tasks according to the statistical result of the task statistics function.
Preferably, the method further comprises the following steps:
interrupting a currently running thread, capturing and storing state information of an interrupted task execution node;
creating a new thread on the new task execution node;
and restoring the operation of the new thread by using the saved state information.
Specifically, the thread pool further includes a saturation processing policy, and the dispatching the target task to the thread in the thread pool, which is matched with the target task, according to a preset dynamic balance scheduling policy further includes:
and when the number of the received target tasks exceeds the maximum thread number and the task queue capacity, processing the excess target tasks by utilizing the saturation policy.
Specifically, the processing of the excess target tasks by using the saturation policy includes:
when the number of the received target tasks exceeds the maximum thread number and the task queue capacity, the new task is not executed, an exception is directly thrown, and a prompt indicates that the thread pool is full;
or, the new task is not executed and no exception is thrown;
or, the first task in the task queue is discarded and replaced with the current new task for execution.
Preferably, the dispatching the target task to the thread matched with the target task in the thread pool according to a preset dynamic balance scheduling policy further includes:
and when the number of the received target tasks does not exceed the maximum thread number range and the task queue capacity, setting a distributed lock for the streaming media data in the target tasks.
In a second aspect, the present invention provides a multithread-based streaming media task processing apparatus, including:
the receiving module is used for receiving a task execution request of a streaming media task, wherein the task execution request comprises a target task to be executed;
the extraction module is used for extracting task information in the target task, creating a thread pool by utilizing pre-created threads according to the task information, and setting corresponding environment variables for each thread in the thread pool;
and the dispatching module is used for dispatching the target task to a thread matched with the target task in the thread pool according to a preset dynamic balance dispatching strategy, and executing the target task by the thread under the environment variable of the thread.
In a third aspect, the present invention provides a computer-readable storage medium, wherein a computer program is stored on the computer-readable storage medium, and when being executed by a processor, the computer program implements the steps of the method for processing a multithread-based streaming media task according to any one of the first aspect.
In a fourth aspect, the present invention provides a computer device, comprising a memory and a processor, wherein the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, cause the processor to execute the steps of the method for processing a multithread-based streaming media task according to any one of the claims in the first aspect.
Compared with the prior art, the technical scheme of the invention at least has the following advantages:
1. The invention provides a multithreading-based streaming media task processing method, which comprises: receiving a task execution request of the streaming media task, wherein the task execution request comprises a target task to be executed; extracting task information in the target task, creating a thread pool by utilizing pre-created threads according to the task information, and setting corresponding environment variables for each thread in the thread pool; and dispatching the target task to a thread matched with the target task in the thread pool according to a preset dynamic balance scheduling strategy, the thread executing the target task under its own environment variable. The invention changes the task threading mode from single-threaded to multithreaded, so the environment variable of each thread is no longer a global variable, packaging and use of the DLL library become possible, the performance and stability of simultaneous multi-channel data processing are improved, and the customer experience is better.
2. The invention attaches the thread pool and the environment group to the server class, adds a dynamic balance scheduling strategy for the server, and locks key data of the server. The invention adds a thread label to each environment variable so that each thread executes its task under its own environment variable, and adds a task statistics function to the task scheduler to count the number of tasks executed by each thread, thereby facilitating the subsequent dispatch of received target tasks to a suitable thread.
3. The method interrupts a currently running thread, captures and stores the state information of the interrupted task execution node, creates a new thread on a new task execution node, and restores the operation of the new thread by using the stored state information, thereby realizing thread migration.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention.
In some of the flows described in the specification, the claims, and the figures above, a number of operations appear in a particular order, but it should be clearly understood that these operations may be executed out of the order in which they appear herein or in parallel. The operation numbers, e.g., S11 and S12, merely distinguish the various operations, and the numbering itself does not imply any order of execution. Additionally, the flows may include more or fewer operations, and these operations may be executed sequentially or in parallel. It should be noted that the descriptions of "first", "second", etc. herein are used to distinguish different messages, devices, modules, etc.; they do not represent a sequential order, nor do they require that "first" and "second" be of different types.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. As used herein, the term "and/or" includes all or any element and all combinations of one or more of the associated listed items.
It will be understood by those of ordinary skill in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, wherein the same or similar reference numerals denote the same or similar elements or elements having the same or similar functions throughout. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a method for processing a streaming media task based on multiple threads. The method involves a user, a mobile terminal, and a server, with the server as the execution subject that implements the functions of the method. As shown in fig. 1, the method includes the following steps:
S11, receiving a task execution request of the streaming media task, wherein the task execution request comprises a target task to be executed.
When the server starts, it first determines the number of CPU cores of the system hardware and derives the maximum number of threads from it. The maximum number of threads is the maximum thread count of the specified thread pool. Generally, the CPU is well utilized when the number of program threads is two to three times the number of CPU threads; an excessive number of threads does not improve performance but suffers from frequent switching between threads, so the thread count needs to be tuned continuously according to the service handled by the threads to determine the optimal number for the current system.
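The sizing rule above can be sketched as a small helper. This is a minimal illustration, not the patent's implementation: the class name, the multiplier of 2 (from the stated "two to three times"), and the lower bound of one core are assumptions.

```java
// Sketch: derive a thread-pool size from the CPU core count, as the text
// suggests (2-3x the core count is a common starting point for streaming
// workloads). The multiplier and the guard against a bad core count are
// illustrative assumptions, not values taken from the patent.
public class PoolSizing {
    static final int MULTIPLIER = 2;   // assumed factor from "two to three times"

    public static int maxThreads(int cpuCores) {
        // Never size below one core, then scale by the multiplier.
        int cores = Math.max(1, cpuCores);
        return cores * MULTIPLIER;
    }

    public static void main(String[] args) {
        int cores = Runtime.getRuntime().availableProcessors();
        System.out.println("cores=" + cores + " maxThreads=" + maxThreads(cores));
    }
}
```

In practice the multiplier itself would be tuned per service, as the paragraph above notes.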
In the embodiment of the present invention, the video stream task execution request includes, but is not limited to, a connection establishment request, a code stream receiving request, and a data processing request, and correspondingly, the target task at least includes connection establishment, code stream receiving, and data processing.
In the embodiment of the invention, after receiving the video stream task execution request, the server acquires a target task in the video stream task execution request, wherein the target task comprises task information of task quantity, task execution mode and execution time.
S12, extracting task information in the target task, creating a thread pool by utilizing pre-created threads according to the task information, and setting corresponding environment variables for each thread in the thread pool.
In a possible design, the present invention preferably creates a thread pool according to the task information as follows:
determining the maximum thread quantity according to the number of the CPU cores and creating threads with corresponding quantity;
creating the thread pool by using a ThreadPoolExecutor class according to the task information, wherein the parameters of the constructor of the ThreadPoolExecutor class at least comprise: the number of core threads, the maximum number of threads, and a task queue for storing the tasks to be executed. The number of core threads specifies how many threads the thread pool keeps running. The maximum number of threads takes effect only when the task queue is full and no more tasks can be added. In the embodiment of the invention, a corresponding number of threads is determined and created according to the number of tasks in the task information to execute the target task.
In the embodiment of the invention, a plurality of threads and task queues are specifically created when the thread pool is created, and the task queues are used for storing tasks which are not processed, so that a buffer mechanism is provided.
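The pool described above can be sketched directly with `java.util.concurrent.ThreadPoolExecutor`. The concrete core size, maximum size, and queue capacity below are illustrative placeholders; in the described method they would come from the CPU core count and the extracted task information.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch: a thread pool with a core size, a maximum size, and a bounded
// task queue that buffers tasks no core thread is free to run.
public class StreamPool {
    public static ThreadPoolExecutor create(int coreThreads, int maxThreads, int queueCapacity) {
        BlockingQueue<Runnable> taskQueue = new ArrayBlockingQueue<>(queueCapacity);
        return new ThreadPoolExecutor(
                coreThreads,            // threads kept running in the pool
                maxThreads,             // upper bound, used only once the queue is full
                60L, TimeUnit.SECONDS,  // idle timeout for threads beyond the core size
                taskQueue);
    }

    public static void main(String[] args) {
        ThreadPoolExecutor pool = create(2, 4, 16);
        pool.execute(() -> System.out.println("task on " + Thread.currentThread().getName()));
        pool.shutdown();
    }
}
```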
In the embodiment of the present invention, the specific scheme when the thread pool executes the target task is as follows:
(1) judging whether the number of currently running threads is less than the number of core threads, and if so, creating a thread to directly execute the target task;
(2) if the number of running threads equals or exceeds the number of core threads, adding the target task to the task queue;
(3) if the task queue is full, checking whether the number of currently running threads is less than the maximum number of threads, and if so, creating a thread to directly execute the task;
(4) if the task queue is full and the number of currently running threads is greater than or equal to the maximum number of threads, processing the excess target task by using the saturation policy.
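The four-step decision above can be written as a pure function so the branching is explicit. This mirrors what `ThreadPoolExecutor.execute()` does internally; the class, method, and enum names below are illustrative, not the library's own.

```java
// Sketch: the dispatch decision of steps (1)-(4) as an explicit branch.
public class DispatchDecision {
    public enum Action { CREATE_CORE_THREAD, ENQUEUE, CREATE_EXTRA_THREAD, SATURATION_POLICY }

    public static Action decide(int runningThreads, int coreThreads,
                                boolean queueFull, int maxThreads) {
        if (runningThreads < coreThreads) return Action.CREATE_CORE_THREAD; // step (1)
        if (!queueFull) return Action.ENQUEUE;                              // step (2)
        if (runningThreads < maxThreads) return Action.CREATE_EXTRA_THREAD; // step (3)
        return Action.SATURATION_POLICY;                                    // step (4)
    }
}
```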
In the embodiment of the invention, when a thread is created, an independent environment variable for that thread is created correspondingly. The independent environment variables of each thread are local variables. When the environment variable is created, identification information of the corresponding thread, such as the thread ID, is acquired as a label of the environment variable and added to it; when the thread later executes a specified task, it executes the task under the environment variable corresponding to its label. This solves the prior-art problem that when one code stream exits while another is still running, the global variables in Live555 are corrupted, causing failures in the packaged DLL library.
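One common way to realize per-thread environment variables with a thread-ID label is Java's `ThreadLocal`. The sketch below is a minimal stand-in under stated assumptions: `Env` is a hypothetical holder class, whereas in the described system it would wrap the formerly global Live555 state.

```java
// Sketch: per-thread "environment variables" held in a ThreadLocal and
// tagged with the owning thread's id, so no thread ever sees another
// thread's (formerly global) state.
public class ThreadEnv {
    static class Env {
        final long threadTag;          // label tying the environment to one thread
        Env(long tag) { this.threadTag = tag; }
    }

    // Each thread that touches ENV gets its own Env, created lazily and
    // tagged with that thread's id.
    static final ThreadLocal<Env> ENV =
            ThreadLocal.withInitial(() -> new Env(Thread.currentThread().getId()));

    public static long currentTag() { return ENV.get().threadTag; }

    public static void main(String[] args) throws InterruptedException {
        Thread worker = new Thread(() ->
                System.out.println("worker env tag = " + currentTag()));
        worker.start();
        worker.join();
        System.out.println("main env tag   = " + currentTag());
    }
}
```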
In one possible design, the present invention specifically creates 1 main thread + N sub-threads when creating the multiple threads. The main thread is mainly responsible for checking whether the server needs to be closed or not, recording log state information and printing information; the sub-thread is mainly responsible for implementing the execution of the target task, such as equipment access connection, audio and video transmission and the like.
In the embodiment of the invention, after the server receives the task, it verifies whether the task quota is exceeded; if so, the server cannot process the current task. If the task quota is not exceeded, the server can process the current task and then sets the distributed lock. The invention protects accessed resources with distributed locks to prevent a resource from being accessed by multiple threads: once a resource is accessed by one thread, a distributed lock is set on it and other threads are forbidden to access it.
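The quota check and per-resource lock flow can be sketched as below. A true distributed lock would live in an external store (e.g. Redis or ZooKeeper); this in-process `ConcurrentHashMap` stand-in only illustrates the acquire/reject/release flow, and the class name and quota limit are assumptions.

```java
import java.util.concurrent.ConcurrentHashMap;

// Sketch: reject a task when the quota is exceeded; otherwise lock the
// resource for the acquiring thread so no other thread may touch it.
public class ResourceGuard {
    private final int quota;                 // maximum tasks the server accepts
    private int activeTasks = 0;
    private final ConcurrentHashMap<String, Long> locks = new ConcurrentHashMap<>();

    public ResourceGuard(int quota) { this.quota = quota; }

    public synchronized boolean tryAcquire(String resourceId) {
        if (activeTasks >= quota) return false;                  // quota exceeded
        Long owner = locks.putIfAbsent(resourceId, Thread.currentThread().getId());
        if (owner != null) return false;                         // resource already locked
        activeTasks++;
        return true;
    }

    public synchronized void release(String resourceId) {
        if (locks.remove(resourceId) != null) activeTasks--;
    }
}
```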
S13, dispatching the target task to a thread matched with the target task in the thread pool according to a preset dynamic balance scheduling strategy, and executing the target task by the thread under the environment variable of the thread.
In the embodiment of the invention, a dynamic balance scheduling strategy corresponding to the thread pool, data sharing management among threads, and the like are created. The dynamic balance scheduling strategy distributes a newly received streaming media task execution request to a suitable thread according to the number of tasks each thread is currently executing. Specifically, the strategy is not simple polling that distributes tasks evenly to each thread; instead, scheduling is performed according to the number of tasks being executed on each thread: the count is increased by 1 when a task starts, decreased by 1 when it finishes, and the thread with the smallest task count receives the next streaming media task execution request.
The invention presets a task statistics function. The task statistics function counts the number of tasks currently executed by each thread, the thread on which each target task is currently executed, and the current execution state. When a thread starts executing a task, its task count is increased by 1; when it finishes, the count is decreased by 1, and the thread with the smallest task count receives the next streaming media task execution request.
According to the method, the mapping relation between each current thread and the task being executed by each thread is counted according to a preset task counting function, and the number of the tasks being executed by each thread is recorded; and distributing the target task to the thread with the least number of tasks currently executed by utilizing the preset dynamic balance scheduling strategy according to the statistical result of the task statistical function.
In one embodiment, the invention can also interrupt the currently running thread, capture the state information of the interrupted task execution node and store the state information; creating a new thread on the new task execution node; and restoring the operation of the new thread by using the stored state information to realize thread migration.
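The migration steps above can be sketched with plain threads. Here the "state" is just a progress counter shared through an illustrative `ResumableTask`; a real streaming task would checkpoint its session and stream position instead.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: interrupt a running thread, keep the state it reached, and let a
// new thread resume from that state - the thread-migration flow in the text.
public class ThreadMigration {
    static class ResumableTask implements Runnable {
        final AtomicInteger progress;   // captured/restored state
        final int target;
        ResumableTask(AtomicInteger progress, int target) {
            this.progress = progress; this.target = target;
        }
        @Override public void run() {
            while (progress.get() < target && !Thread.currentThread().isInterrupted())
                progress.incrementAndGet();   // one unit of work
        }
    }

    public static int migrateAndFinish(int target) throws InterruptedException {
        AtomicInteger state = new AtomicInteger(0);
        Thread first = new Thread(new ResumableTask(state, target));
        first.start();
        first.interrupt();                 // interrupt the running thread...
        first.join();                      // ...and capture its state
        Thread second = new Thread(new ResumableTask(state, target));
        second.start();                    // new thread resumes from saved state
        second.join();
        return state.get();
    }
}
```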
In one embodiment, when the number of received target tasks exceeds the maximum thread number and the task queue capacity, the excess target tasks are processed by the saturation policy. Specifically, when the number of received target tasks exceeds the maximum thread number and the task queue capacity, the new task is not executed, an exception is directly thrown, and a prompt indicates that the thread pool is full; or, the new task is not executed and no exception is thrown; or, the first task in the task queue is discarded and replaced with the current new task for execution; or, the execute method is called directly so that the current new task runs in the caller's thread.
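If the pool is a `ThreadPoolExecutor`, the four options above correspond to the standard `RejectedExecutionHandler` policies: throw and report "pool full" (`AbortPolicy`), drop silently (`DiscardPolicy`), evict the oldest queued task (`DiscardOldestPolicy`), and run the new task in the caller's thread (`CallerRunsPolicy`). The sketch below demonstrates only the first; the tiny one-thread, one-slot pool exists purely to force saturation.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch: AbortPolicy throws RejectedExecutionException once the single
// worker thread is busy and the single queue slot is full.
public class SaturationDemo {
    public static boolean abortPolicyThrows() throws InterruptedException {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(1),
                new ThreadPoolExecutor.AbortPolicy());
        Runnable sleeper = () -> {
            try { Thread.sleep(200); } catch (InterruptedException ignored) { }
        };
        boolean threw = false;
        try {
            pool.execute(sleeper);   // occupies the single thread
            pool.execute(sleeper);   // fills the single queue slot
            pool.execute(sleeper);   // saturated: AbortPolicy rejects
        } catch (RejectedExecutionException e) {
            threw = true;            // the "thread pool is full" signal
        }
        pool.shutdownNow();
        return threw;
    }
}
```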
In one embodiment, when the number of the received target tasks does not exceed the maximum thread number range and the task queue capacity, a distributed lock is set for the streaming media data in the target tasks.
In the embodiment of the invention, as a multithreading operation mode is adopted, when the connection request of the client is received, the client connection is correspondingly distributed to different thread processing. Specifically, the invention enables each client to have respective environment variables according to different threads instead of global environment quantity by modifying all client sessions and client connection environment base class acquisition modes, thereby improving the processing efficiency.
Referring to fig. 3, an embodiment of the present invention further provides a processing apparatus for a multi-thread based streaming media task, and in one embodiment, the processing apparatus includes a receiving module 11, an extracting module 12, and a dispatching module 13. Wherein,
the receiving module 11 is configured to receive a task execution request of a streaming media task, where the task execution request includes a target task to be executed.
When the server starts, it first determines the number of CPU cores of the system hardware and derives the maximum number of threads from it. The maximum number of threads is the maximum thread count of the specified thread pool. Generally, the CPU is well utilized when the number of program threads is two to three times the number of CPU threads; an excessive number of threads does not improve performance but suffers from frequent switching between threads, so the thread count needs to be tuned continuously according to the service handled by the threads to determine the optimal number for the current system.
In the embodiment of the present invention, the video stream task execution request includes, but is not limited to, a connection establishment request, a code stream receiving request, and a data processing request, and correspondingly, the target task at least includes connection establishment, code stream receiving, and data processing.
In the embodiment of the invention, after receiving the video stream task execution request, the server acquires a target task in the video stream task execution request, wherein the target task comprises task information of task quantity, task execution mode and execution time.
And the extraction module 12 is configured to extract task information in the target task, create a thread pool by using a pre-created thread according to the task information, and set a corresponding environment variable for each thread in the thread pool.
In a possible design, the present invention preferably creates a thread pool according to the task information as follows:
determining the maximum thread quantity according to the number of the CPU cores and creating threads with corresponding quantity;
creating the thread pool by using a ThreadPoolExecutor class according to the task information, wherein the parameters of the constructor of the ThreadPoolExecutor class at least comprise: the number of core threads, the maximum number of threads, and a task queue for storing the tasks to be executed. The number of core threads specifies how many threads the thread pool keeps running. The maximum number of threads takes effect only when the task queue is full and no more tasks can be added. In the embodiment of the invention, a corresponding number of threads is determined and created according to the number of tasks in the task information to execute the target task.
In the embodiment of the invention, a plurality of threads and task queues are specifically created when the thread pool is created, and the task queues are used for storing tasks which are not processed, so that a buffer mechanism is provided.
In the embodiment of the present invention, the specific scheme when the thread pool executes the target task is as follows:
(5) judging whether the number of currently running threads is less than the number of core threads, and if so, creating a thread to directly execute the target task;
(6) if the number of running threads equals or exceeds the number of core threads, adding the target task to the task queue;
(7) if the task queue is full, checking whether the number of currently running threads is less than the maximum number of threads, and if so, creating a thread to directly execute the task;
(8) if the task queue is full and the number of currently running threads is greater than or equal to the maximum number of threads, processing the excess target task by using the saturation policy.
In the embodiment of the invention, when a thread is created, an independent environment variable for that thread is created correspondingly. The independent environment variables of each thread are local variables. When the environment variable is created, identification information of the corresponding thread, such as the thread ID, is acquired as a label of the environment variable and added to it; when the thread later executes a specified task, it executes the task under the environment variable corresponding to its label. This solves the prior-art problem that when one code stream exits while another is still running, the global variables in Live555 are corrupted, causing failures in the packaged DLL library.
In one possible design, the present invention specifically creates 1 main thread + N sub-threads when creating the multiple threads. The main thread is mainly responsible for checking whether the server needs to be closed or not, recording log state information and printing information; the sub-thread is mainly responsible for implementing the execution of the target task, such as equipment access connection, audio and video transmission and the like.
In the embodiment of the invention, after the server receives the task, it verifies whether the task quota is exceeded; if so, the server cannot process the current task. If the task quota is not exceeded, the server can process the current task and then sets the distributed lock. The invention protects accessed resources with distributed locks to prevent a resource from being accessed by multiple threads: once a resource is accessed by one thread, a distributed lock is set on it and other threads are forbidden to access it.
And the dispatching module 13 is configured to dispatch the target task to a thread in the thread pool, where the thread is matched with the target task, according to a preset dynamic balance scheduling policy, and execute the target task by the thread under its own environment variable.
In the embodiment of the invention, a dynamic balance scheduling strategy corresponding to the thread pool, data sharing management among threads, and the like are created. The dynamic balance scheduling strategy distributes a newly received streaming media task execution request to a suitable thread according to the number of tasks each thread is currently executing. Specifically, the strategy is not simple polling that distributes tasks evenly to each thread; instead, scheduling is performed according to the number of tasks being executed on each thread: the count is increased by 1 when a task starts, decreased by 1 when it finishes, and the thread with the smallest task count receives the next streaming media task execution request.
The invention presets a task statistics function. The task statistics function counts the number of tasks currently executed by each thread, the thread on which each target task is currently executed, and the current execution state. When a thread starts executing a task, its task count is increased by 1; when it finishes, the count is decreased by 1, and the thread with the smallest task count receives the next streaming media task execution request.
According to the method, the mapping relation between each current thread and the task being executed by each thread is counted according to a preset task counting function, and the number of the tasks being executed by each thread is recorded; and distributing the target task to the thread with the least number of tasks currently executed by utilizing the preset dynamic balance scheduling strategy according to the statistical result of the task statistical function.
In one embodiment, the invention can also interrupt the currently running thread, capture the state information of the interrupted task execution node and store the state information; creating a new thread on the new task execution node; and restoring the operation of the new thread by using the stored state information to realize thread migration.
In one embodiment, when the number of received target tasks exceeds the maximum thread number and the task queue capacity, the excess target tasks are processed by the saturation policy. Specifically, when the number of received target tasks exceeds the maximum thread number and the task queue capacity, the new task is not executed, an exception is directly thrown, and a prompt indicates that the thread pool is full; or, the new task is not executed and no exception is thrown; or, the first task in the task queue is discarded and replaced with the current new task for execution; or, the execute method is called directly so that the current new task runs in the caller's thread.
In one embodiment, when the number of the received target tasks does not exceed the maximum thread number range and the task queue capacity, a distributed lock is set for the streaming media data in the target tasks.
In the embodiment of the invention, as a multithreading operation mode is adopted, when the connection request of the client is received, the client connection is correspondingly distributed to different thread processing. Specifically, the invention enables each client to have respective environment variables according to different threads instead of global environment quantity by modifying all client sessions and client connection environment base class acquisition modes, thereby improving the processing efficiency.
In another embodiment, an embodiment of the present invention provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the method for processing a multithreading-based streaming media task according to any one of the above technical solutions. The computer-readable storage medium includes, but is not limited to, any type of disk including floppy disks, hard disks, optical disks, CD-ROMs, and magneto-optical disks, ROMs (Read-Only Memories), RAMs (Random Access Memories), EPROMs (Erasable Programmable Read-Only Memories), EEPROMs (Electrically Erasable Programmable Read-Only Memories), flash memories, magnetic cards, or optical cards. That is, a storage device includes any medium that stores or transmits information in a form readable by a device (e.g., a computer or a cellular phone), and may be a read-only memory, a magnetic or optical disk, or the like.
The computer-readable storage medium provided by the embodiment of the invention can receive a task execution request of a streaming media task, wherein the task execution request comprises a target task to be executed; extract task information from the target task, create a thread pool using pre-created threads according to the task information, and set corresponding environment variables for each thread in the thread pool; and dispatch the target task to a thread in the thread pool matched with the target task according to a preset dynamic balance scheduling strategy, the thread executing the target task under its own environment variables. The invention changes the single-threaded task mode into a multithreaded one, so that the environment variable of each thread is no longer a global variable; this makes packaging and use of a DLL library possible, improves the performance and stability of simultaneous multi-channel data processing, and provides a better customer experience.
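The pool construction the method describes — a maximum thread count derived from the CPU core count, a core thread count, and a task queue for pending tasks — can be sketched with Java's `ThreadPoolExecutor` constructor. The sizing rule, queue capacity, and thread names below are illustrative assumptions, not values given in the text:

```java
import java.util.concurrent.LinkedBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class StreamPoolFactory {
    // Size the pool from the number of CPU cores, as the method describes.
    public static ThreadPoolExecutor newStreamPool() {
        int cores = Runtime.getRuntime().availableProcessors();
        int maxThreads = cores * 2;            // illustrative sizing rule
        AtomicInteger seq = new AtomicInteger();
        return new ThreadPoolExecutor(
                cores,                         // core thread count
                maxThreads,                    // maximum thread count
                60L, TimeUnit.SECONDS,         // idle timeout for extra threads
                new LinkedBlockingQueue<>(128),// queue of pending target tasks
                r -> {                         // thread factory: name each worker
                    Thread t = new Thread(r, "stream-worker-" + seq.incrementAndGet());
                    t.setDaemon(true);
                    return t;
                });
    }
}
```

A per-thread initializer (for example binding thread-local environment variables on first use) could be added in the thread factory's `Runnable` wrapper, which is where each worker thread begins execution.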
Further, in still another embodiment, the present invention provides a computer device, as shown in fig. 3, including a processor 303, a memory 305, an input unit 307, and a display unit 309. Those skilled in the art will appreciate that the structure shown in fig. 3 does not constitute a limitation on the computer device, which may include more or fewer components than shown, or combine certain components. The memory 305 may be used to store the application 301 and various functional modules, and the processor 303 executes the application 301 stored in the memory 305, thereby performing the various functional applications and data processing of the device. The memory 305 may be an internal memory, an external memory, or include both. The internal memory may comprise read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory, or random access memory. The external memory may include a hard disk, a floppy disk, a ZIP disk, a USB disk, a magnetic tape, etc. The memory disclosed herein includes, but is not limited to, these types and is provided as an example, not a limitation.
The input unit 307 is used for receiving input signals and receiving keywords entered by a user. The input unit 307 may include a touch panel and other input devices. The touch panel can collect touch operations by a user on or near it (for example, operations performed on or near the touch panel with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connected device according to a preset program; other input devices may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., play control keys, switch keys, etc.), a trackball, a mouse, and a joystick. The display unit 309 may be used to display information input by the user or provided to the user, as well as the various menus of the computer device; it may take the form of a liquid crystal display, an organic light-emitting diode display, or the like. The processor 303 is the control center of the computer device: it connects the various parts of the entire computer using various interfaces and lines, and performs various functions and processes data by running or executing software programs and/or modules stored in the memory 305 and calling data stored in the memory. The one or more processors 303 shown in fig. 3 can execute and implement the functions of the receiving module 11, the extracting module 12, and the dispatching module 13.
In one embodiment, the computer device includes a memory 305 and a processor 303, the memory 305 stores computer readable instructions, and the computer readable instructions, when executed by the processor, cause the processor 303 to execute the steps of the method for processing a multithreading-based streaming media task according to the above embodiment.
The embodiment of the invention provides a computer device that can receive a task execution request of a streaming media task, wherein the task execution request comprises a target task to be executed; extract task information from the target task, create a thread pool using pre-created threads according to the task information, and set corresponding environment variables for each thread in the thread pool; and dispatch the target task to a thread in the thread pool matched with the target task according to a preset dynamic balance scheduling strategy, the thread executing the target task under its own environment variables. The invention adopts a multithreaded task processing mode, achieves reasonable assignment of target tasks through the dynamic balance scheduling strategy, and uses local variables as environment variables, which makes packaging and use of a DLL library possible, improves the performance and stability of simultaneous multi-channel data processing, and provides a better customer experience.
The computer device provided in the embodiment of the present invention can implement the above embodiment of the multithreading-based streaming media task processing method; for the specific function implementation, reference is made to the description in the method embodiment, which is not repeated here.
It will be understood by those skilled in the art that all or part of the processes of the methods in the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and when executed, may include the processes of the above method embodiments. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disk, a read-only memory (ROM), or a random access memory (RAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only several implementations of the present invention, and while their description is specific and detailed, it should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, and these fall within the scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.