Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail and completely with reference to the following specific embodiments of the present application and the accompanying drawings. It should be apparent that the described embodiments are only some of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
To address the technical problem of low web page production efficiency, an embodiment of the present application provides a solution whose basic idea is as follows: the concept of a shared editing area is introduced into the web page production platform, and an online communication function is added for users belonging to the same shared editing area. The terminal devices of the users in the same shared editing area can display a shared editing interface for web page production; acquire an input message in response to a communication trigger event of the shared editing interface; and provide the message to the server device. The server device can provide the message to the other terminals of the shared editing area, so that timely communication among users is realized during web page production: the users can promptly discuss the production or modification direction of the web page, developers can promptly modify the web page contents, and web page production efficiency is improved.
The technical solutions provided by the embodiments of the present application are described in detail below with reference to the accompanying drawings.
It should be noted that: like reference numerals refer to like objects in the following figures and embodiments, and thus, once an object is defined in one figure or embodiment, further discussion thereof is not required in subsequent figures and embodiments.
Fig. 1 is a schematic structural diagram of an information processing system according to an embodiment of the present application. As shown in fig. 1, the system includes: a plurality of terminal devices 11 and a server device 12. Wherein, a plurality means 2 or more.
The server device 12 and the terminal device 11 may be connected wirelessly or by wire. Optionally, the server device 12 may be communicatively connected to the terminal device 11 through a mobile network, and accordingly, the network standard of the mobile network may be any one of 2G (GSM), 2.5G (GPRS), 3G (WCDMA, TD-SCDMA, CDMA2000, UMTS), 4G (LTE), 4G+ (LTE+), 5G, WiMax, and the like. Alternatively, the server device 12 may also be communicatively connected to the terminal device 11 through Bluetooth, WiFi, infrared, or the like.
In this embodiment, the terminal device 11 is a terminal device used by a user and having functions of computing, accessing internet, communicating and the like required by the user, and may be, for example, a smart phone, a tablet computer, a personal computer, a wearable device and the like.
In this embodiment, the server device 12 is a computer device capable of responding to service requests of the terminal devices 11 and providing services related to web page production for the users, and generally has the capability of undertaking and guaranteeing services. The server device 12 may be a single server device, a cloud server array, or a Virtual Machine (VM) running in a cloud server array. In addition, the server device may also refer to another computing device with corresponding service capabilities, such as a terminal device (running a service program) such as a computer.
Alternatively, the server device 12 may maintain a web page production platform or system, and the terminal device 11 may access the web page production platform by accessing a link of the web page production platform or system. The web page may be an H5 web page.
In the present embodiment, a plurality of terminal apparatuses 11 serve the same shared editing area (for convenience of description, the shared editing area is defined as a target shared editing area). The shared editing area is used for making a web page, and a plurality of terminal devices 11 serving the same shared editing area can collaboratively edit the web page currently being made in the target shared editing area.
Wherein each user has at least one terminal device 11. In the embodiment of the present application, the specific implementation manner in which a user enters the target shared editing area is not limited. For example, a user of the first terminal device 11a serving the shared editing area may access the web page production platform by accessing a link of the web page production platform or system. Accordingly, the first terminal device may display a login page in response to the access operation to the web page production platform or system. The user of the first terminal device can enter the shared editing area through a user name, a password and the like registered in advance. Accordingly, the first terminal device can enter the shared editing area in response to a login event and display the shared editing interface. The login event can be implemented as a login event generated by a triggering operation on a login control in the login page.
As shown in fig. 1, the shared editing interface may include a sharing control, and the user may invite other users to enter the shared editing area through the sharing control. Correspondingly, the first terminal device can generate the identifier of the shared editing area in response to a triggering operation on the sharing control. Optionally, the identifier of the shared editing area may be information for uniquely identifying one shared editing area, such as an Identification (ID) of the shared editing area, a password of the shared editing area, a graphic code (such as a barcode, a two-dimensional code, etc.), a link, or the like, but is not limited thereto.
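The generation of such an identifier can be sketched as follows in TypeScript. All names here (`SharedEditArea`, `createAreaIdentifier`, the example domain) are illustrative assumptions, not part of the described embodiment; a real platform would generate the identifier randomly on the server side rather than deterministically.

```typescript
interface SharedEditArea {
  id: string;       // identification (ID) of the shared editing area
  password: string; // password that may also identify the area
  link: string;     // shareable link for inviting other users
}

// Deterministic for illustration only; a real platform would use a
// random, server-generated identifier.
function createAreaIdentifier(seq: number, secret: string): SharedEditArea {
  const id = `area-${seq.toString(36).padStart(6, "0")}`;
  return {
    id,
    password: secret,
    link: `https://example-webmaker.test/join/${id}`,
  };
}

const area = createAreaIdentifier(1001, "s3cret");
console.log(area.link); // link the first terminal device shares with other users
```

Any one of the three fields (ID, password, or link) suffices to uniquely identify the shared editing area, which is why the embodiment lists them as interchangeable forms of the identifier.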
Further, for the user of the other terminal receiving the identifier of the shared editing area, the user can enter the shared editing area through the identifier of the shared editing area. In the embodiment of the application, for convenience of description and distinction, a terminal entering a shared editing area through an identifier of the shared editing area shared by a first terminal device is defined as a second terminal device. Wherein, the number of the second terminal equipment can be 1 or more. Plural means 2 or more.
Accordingly, the second terminal device may display the login page in response to an access operation to the web page production platform or system. The user of the second terminal device can input the identifier of the shared editing area through the login page. Correspondingly, the second terminal equipment can acquire the identifier of the shared editing area; and entering the shared editing area based on the identification of the shared editing area.
After the second terminal device enters the shared editing area, each of the plurality of terminal devices 11 serving the same shared editing area may display a shared editing page, which is mainly used for making or editing a web page. As shown in fig. 1, the shared editing page may include: a layout component, a materials component, and an effects preview component, among others. Wherein the layout component can provide layout templates, and the materials component can provide a variety of web page element materials. The effects preview component can be used by a user to preview the effect of the web page currently being made in the shared editing area. Accordingly, the terminal device 11 may display the web page currently being made in the shared editing area in response to a trigger operation on the effects preview component. In fig. 1, the web page currently being made in the shared editing area is illustrated as including a notification bulletin, a news story, and an employee home page, but is not limited thereto.
In this embodiment, in order to facilitate communication between the users corresponding to the terminal devices in the same shared editing area, the terminal device 11 may further acquire an input message in response to a communication trigger event in the shared editing area. The message may reflect the production direction for the web page currently being produced in the shared editing area, for example the message shown in fig. 1: "please enlarge the page of the notification announcement a little"; and so on.
The communication trigger event is an event that can trigger the terminal device 11 to acquire a message to be communicated to other terminal devices. The implementation form of the communication trigger event will be described in the following embodiments and is not detailed here.
Further, the terminal device 11 may provide the acquired message to the server device 12. Accordingly, the server device 12 can provide the message to the other terminal devices serving the same shared editing area as the terminal device providing the message. The number of other terminal devices is 1 or more; multiple refers to 2 or more, but less than the total number of terminal devices serving the same shared editing area. Accordingly, the other terminal devices may receive the message and output the message. Through messages that reflect the production direction of the web page currently being produced in the shared editing area, communication among the users of the plurality of terminal devices 11 regarding the production direction of the web page can be realized. The specific implementation of outputting the message by the other terminal devices will be described in the following embodiments and is not detailed here.
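The server-side relay step can be sketched as follows. The type and function names (`RelayedMessage`, `AreaRegistry`, `relay`) are illustrative assumptions; the sketch only shows the selection of recipients, namely every terminal serving the same shared editing area except the sender.

```typescript
interface RelayedMessage {
  areaId: string;   // identifier of the shared editing area
  senderId: string; // terminal device that provided the message
  body: string;     // e.g. a production direction for the web page
}

// areaId -> identifiers of the terminal devices serving that area
type AreaRegistry = Map<string, Set<string>>;

// Returns the terminal devices the server should deliver the message to:
// every terminal in the same shared editing area except the sender.
function relay(registry: AreaRegistry, msg: RelayedMessage): string[] {
  const terminals = registry.get(msg.areaId) ?? new Set<string>();
  return Array.from(terminals).filter((t) => t !== msg.senderId);
}

const registry: AreaRegistry = new Map();
registry.set("area-1", new Set(["t-a", "t-b", "t-c"]));
const recipients = relay(registry, {
  areaId: "area-1",
  senderId: "t-a",
  body: "please enlarge the page of the notification announcement a little",
});
console.log(recipients); // the other terminals serving area-1
```

As in the embodiment, the sender is excluded, so the recipient count is always less than the total number of terminals serving the area.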
In this embodiment, the concept of a shared editing area is introduced on the web page production platform, and an online communication function is added for users belonging to the same shared editing area. The terminal devices of the users in the same shared editing area can display a shared editing interface for web page production; acquire an input message in response to a communication trigger event of the shared editing interface; and provide the message to the server device. The server device can provide the message to the other terminals of the shared editing area, so that timely communication among users is realized during web page production: the users can promptly discuss the production or modification direction of the web page, developers can promptly modify the web page contents, and web page production efficiency is improved.
In the embodiment of the present application, the terminal device providing the message may be any terminal device serving the same shared editing area, and the plurality of terminal devices 11 have the same function. In the following embodiments, the information processing method provided in the embodiments of the present application is described by taking the first terminal device 11a as an example. The first terminal device 11a is the terminal device that provides the message among the plurality of terminal devices 11, and may be any one of them. For convenience of description and distinction, the other terminal devices of the plurality of terminal devices 11, except for the first terminal device 11a, are defined as second terminal devices 11b. The number of second terminal devices 11b may be 1 or more, and is determined by the total number of the plurality of terminal devices 11.
In this embodiment, the first terminal device 11a, when acting as a message sender, may acquire an input message in response to a communication trigger event of the shared editing interface. The specific implementation form of the communication trigger event is not limited. The following is an exemplary description in connection with several alternative embodiments.
Embodiment A: the shared editing interface comprises a message input control. Accordingly, the communication trigger event may be implemented as a communication trigger event generated by a touch operation on the message input control. For the first terminal device 11a, in response to a communication trigger event generated by a touch operation on the message input control, the input data may be acquired, encapsulated as a message, and provided to the server device 12. The message may be a text message or a voice message, and may reflect the production direction for the web page currently being produced in the shared editing area.
In some embodiments, the first terminal device 11a may provide a text message input function. The message input control may be implemented as a text input control. For the first terminal device 11a, in response to a text communication trigger event generated by a touch operation on the text input control, the input text data may be acquired and encapsulated as a text message.
In other embodiments, the first terminal device 11a may also provide a voice input function. The message input control may be implemented as a voice input control. Accordingly, the communication trigger event may be implemented as a voice communication trigger event generated by a touch operation on the voice input control. For the first terminal device 11a, in response to such an event, the input sound data may be acquired and encapsulated as a voice message. A specific implementation is as follows: the first terminal device 11a, in response to a voice communication trigger event generated by a touch operation on the voice input control of the shared editing interface, starts a microphone of the first terminal device 11a to pick up sound; controls the microphone to stop picking up sound in response to a voice communication stop event; acquires at least the sound data picked up before the microphone stops picking up sound as the input data; and encapsulates at least the sound data picked up before the microphone stops picking up sound as a voice message.
In this embodiment, a specific implementation of the voice communication stop event is not limited. The following is an exemplary description in connection with the manner in which the voice input control is activated.
Embodiment A1: the activation mode of the voice input control provided by the first terminal device 11a is a tap mode. The user may trigger or activate the voice input control by clicking on it. In such an embodiment, the first terminal device 11a may also provide a stop voice input control, and the user may click the stop voice input control to end the voice input. Accordingly, the first terminal device 11a may start a microphone to pick up sound in response to a touch operation on the voice input control; when the user wants to stop the voice input, the stop voice input control can be triggered to end it. The voice communication stop event may thus be implemented as a voice communication stop event generated by a touch operation on the stop voice input control, in response to which the first terminal device 11a controls the microphone to stop picking up sound. Optionally, the stop voice input control and the voice input control may be implemented as different controls; alternatively, they may be implemented as the same control. In the latter case, the control defaults to a voice input control whose state defaults to "no voice input"; when the control is triggered, its state is adjusted to "voice input started", and in that state the control is implemented as the stop voice input control.
For such a voice input control: if the control is in the default state, the event generated by a triggering operation on it is a voice communication trigger event; if the control is in the "voice input started" state, the event generated by a triggering operation on it is a voice communication stop event.
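The single-control variant of embodiment A1 behaves like a small state machine, sketched below. The names (`VoiceControlState`, `onControlTriggered`) are illustrative assumptions.

```typescript
type VoiceControlState = "idle" | "recording";

interface VoiceEvent {
  kind: "voice-trigger" | "voice-stop";
}

// In the default (idle) state, a triggering operation generates a voice
// communication trigger event and the control becomes a stop voice input
// control; while recording, the same operation generates a voice
// communication stop event and the control reverts to its default state.
function onControlTriggered(state: VoiceControlState): {
  next: VoiceControlState;
  event: VoiceEvent;
} {
  if (state === "idle") {
    return { next: "recording", event: { kind: "voice-trigger" } };
  }
  return { next: "idle", event: { kind: "voice-stop" } };
}
```

A first tap thus starts the microphone and a second tap stops it, matching the "same control" alternative described above.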
Embodiment A2: in order to further improve the user experience and prevent a voice message from being so long that the user experience suffers, a duration threshold (defined as the first duration) may be set for the pickup duration of a voice message. When the duration of the voice message input by the user reaches the set duration threshold, the voice input is ended automatically. In this embodiment, the specific value of the first duration is not limited; optionally, the first duration may be 1 min, 50 s, 30 s, or the like. Accordingly, the first terminal device 11a may also time the pickup duration of the microphone in response to the touch operation on the voice input control and, in a case where the pickup duration of the microphone reaches the first duration, determine that a voice communication stop event occurs and control the microphone to stop picking up sound.
Embodiment A3: the activation mode of the voice input control provided by the first terminal device 11a is a long-press mode. The user may trigger or activate the voice input control by long-pressing it: voice is input while the voice input control is pressed, and the control is released when the voice input is to be stopped. Accordingly, the first terminal device 11a may start its microphone to pick up sound in response to a voice communication trigger event generated by a touch operation on the voice input control, and control the microphone to stop picking up sound in response to a voice communication stop event generated by releasing the voice input control. In embodiment A3, the first terminal device 11a may determine that a voice communication stop event occurs when it detects that the voice input control is released.
Embodiment A4: in order to further improve the user experience and prevent a voice message from being too long, a duration threshold (defined as the second duration) may be set for the pickup duration of a voice message. The second duration has the same function as the first duration. When the duration of the voice message input by the user reaches the set second duration, the first terminal device 11a automatically ends the voice input even if the user is still pressing the voice input control. In this embodiment, the specific value of the second duration is not limited, and the size relationship between the second duration and the first duration is not limited. The second duration may be 1 min, 50 s, 30 s, or the like. Correspondingly, the first terminal device 11a may also time the pickup duration of the microphone in response to the touch operation on the voice input control and, when the pickup duration of the microphone reaches the set second duration, determine that a voice communication stop event occurs and control the microphone to stop picking up sound.
In the embodiment of the application, in order to prevent a misoperation of the user on the voice input control, the pickup duration of the microphone is also timed. If the pickup duration of the microphone is greater than or equal to a set duration (recorded as the third duration) when the voice communication stop event occurs, at least the sound data picked up before the microphone stops picking up sound is acquired, encapsulated into a message, and provided to the server device 12. Optionally, the third duration is less than the first duration and the second duration, and should not be too long; the third duration may be 1 s, 800 ms, 500 ms, or the like.
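The two duration checks above can be sketched as follows. The concrete threshold values are illustrative assumptions only; the embodiment deliberately leaves the specific values of the first/second and third durations open.

```typescript
// Illustrative thresholds; not prescribed by the embodiment.
const MAX_PICKUP_MS = 60_000; // first/second duration, e.g. 1 min
const MIN_PICKUP_MS = 500;    // third duration, e.g. 500 ms

// True when pickup has reached the first/second duration, so a voice
// communication stop event should be generated automatically.
function shouldAutoStop(pickupMs: number): boolean {
  return pickupMs >= MAX_PICKUP_MS;
}

// True when pickup lasted at least the third duration, so the sound
// data is treated as intentional and encapsulated as a voice message
// rather than discarded as a misoperation.
function shouldSendVoiceMessage(pickupMs: number): boolean {
  return pickupMs >= MIN_PICKUP_MS;
}
```

The upper threshold caps over-long messages (embodiments A2 and A4), while the lower threshold filters out accidental taps on the voice input control.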
In still other embodiments, the first terminal device 11a may provide both a text input function and a voice input function. The message input control may be implemented as a text input control and a voice input control, and the user can autonomously select whether to use text input or voice input.
The second terminal device 11b may receive the message provided by the first terminal device 11a and sent by the server device 12, and output the message, so as to implement communication between users serving the same shared editing area in the web page production scenario. The second terminal device 11b may display the shared editing interface and display an identification of the received message on the shared editing interface. The identification of the message may be a message frame corresponding to the message.
Optionally, for a text message, the identification of the message may include: a user identification (e.g., a nickname) and the text message content. For a voice message, the identification of the message may include: a user identification, the duration of the voice message, a sound wave identification, and the like. For a voice message that the user has not listened to, an unlistened reminder may also be displayed; for example, a small red dot or the like is displayed near the identification of the unlistened voice message. For the display of the user identification, reference may be made to the related content of the text message, which is not repeated here. Further, the terminal device may play a voice message in response to a touch operation on that voice message.
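Building the on-screen identification of a received message can be sketched as follows; the type and field names are illustrative assumptions, and an asterisk stands in for the small red dot reminder.

```typescript
type ReceivedMessage =
  | { kind: "text"; user: string; content: string }
  | { kind: "voice"; user: string; durationSec: number; listened: boolean };

// Builds a one-line identification for display on the shared editing
// interface: user identification plus content for text messages, or
// user identification plus duration (and an unlistened reminder, shown
// here as "*") for voice messages.
function messageIdentification(m: ReceivedMessage): string {
  if (m.kind === "text") {
    return `${m.user}: ${m.content}`;
  }
  const reminder = m.listened ? "" : " *";
  return `${m.user}: [voice ${m.durationSec}s]${reminder}`;
}
```

The reminder disappears once the voice message has been played, matching the unlistened-reminder behavior described above.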
Further, after the communication among the users of the plurality of terminal devices 11 regarding the production direction of the web page is completed, or completed in stages, the developer of the web page may edit or modify the web page based on the communicated content. Correspondingly, any terminal device can acquire, in response to an editing operation on the web page, the web page content associated with the editing operation, and edit or modify the web page based on that content; the editing operation on the web page is issued based on the communication content of the users of the plurality of terminal devices regarding the production direction of the web page.
The web page production system provided by the embodiment of the application not only provides a real-time online communication function, but can also provide a multi-user collaborative development function. To prevent resource conflicts, different users are restricted to editing different web page elements at the same time. Each of the plurality of terminal devices 11 serving the same shared editing area may acquire, in response to an editing operation on a web page element, the web page content associated with that editing operation, and edit or modify the web page element based on that content; the web page elements on which the plurality of terminal devices 11 perform editing operations at the same time are different.
Optionally, the first terminal device 11a may provide the identifier of the selected target web page element to the server device 12 in response to a selection operation on a web page element. Accordingly, the server device 12 may match the identifier of the target web page element against the identifiers of the web page elements in an editing state corresponding to the shared editing area; if the matching succeeds, it is determined that the target web page element is currently being edited by another terminal device 11b corresponding to the shared editing area. Optionally, the server device 12 may return to the first terminal device 11a a prompt that the web page element is being edited by another user, so as to prompt the user of the first terminal device 11a to edit other web page elements.
Accordingly, if the identifier of the target web page element does not match any identifier of a web page element in an editing state corresponding to the shared editing area, it is determined that the target web page element is not currently being edited by another terminal device 11b corresponding to the shared editing area. The server device 12 may then allow, or authorize, the first terminal device 11a to edit the target web page element, so that the user of the first terminal device 11a can edit it. Further, the server device 12 may add the identifier of the target web page element to the identifiers of the web page elements in the editing state corresponding to the shared editing area, so as to prevent other users from simultaneously editing the target web page element and to prevent resource conflicts.
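The server-side matching and locking steps above can be sketched as follows. The names (`EditLocks`, `tryLockElement`) are illustrative assumptions for the identifier-matching mechanism just described.

```typescript
// areaId -> identifiers of web page elements currently in an editing state
type EditLocks = Map<string, Set<string>>;

// Matches the identifier of the target web page element against the
// identifiers already in an editing state for the shared editing area.
// Returns true and records the identifier if the element is free;
// returns false if another terminal device is already editing it.
function tryLockElement(
  locks: EditLocks,
  areaId: string,
  elementId: string
): boolean {
  const editing = locks.get(areaId) ?? new Set<string>();
  if (editing.has(elementId)) {
    return false; // prompt the user that another user is editing it
  }
  editing.add(elementId); // prevent simultaneous repeated editing
  locks.set(areaId, editing);
  return true;
}
```

A complete system would also release the lock when editing ends, which the embodiment leaves implicit.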
In addition to the system embodiments described above, the embodiments of the present application also provide an information processing method, and the information processing method provided by the embodiments of the present application is exemplarily described below from the perspective of a terminal device.
Fig. 2 is a schematic flowchart of an information processing method according to an embodiment of the present application. The method is applicable to a first terminal and, as shown in fig. 2, includes:
201. Displaying a shared editing interface for web page production.
202. Acquiring an input message in response to a communication trigger event of the shared editing interface; wherein the message may reflect a production direction for the web page currently being produced in the shared editing area.
203. Providing the message to the server device, so that the server device provides the message to a second terminal that serves the same shared editing area as the first terminal.
The terminal device executing the method may be any terminal serving the same shared editing area, and is defined as the first terminal for convenience of description.
Fig. 3 is a schematic flowchart of another information processing method according to an embodiment of the present application. The method is applicable to a second terminal and, as shown in fig. 3, includes:
301. Displaying a shared editing interface for web page production.
302. Receiving a message from another terminal, provided by the server device; the other terminal and the second terminal serve the same shared editing area; wherein the message may reflect a production direction for the web page currently being produced in the shared editing area.
303. Displaying an identification of the message on the shared editing interface.
In this embodiment, the first terminal is any one of a plurality of terminals serving the same shared editing area and acts as the message sending terminal. The second terminal is any other terminal in the same shared editing area as the first terminal and acts as a message receiving terminal.
Wherein each user has at least one terminal device. In the embodiment of the present application, the specific implementation manner in which a user enters the target shared editing area is not limited. For example, a user of any terminal serving the shared editing area may access the web page production platform by accessing a link of the web page production platform or system. Accordingly, the terminal may display a login page in response to the access operation to the web page production platform or system. The user of the terminal can enter the shared editing area through a user name, a password and the like registered in advance. Accordingly, the terminal may enter the shared editing area and display the shared editing interface in response to a login event. The login event can be implemented as a login event generated by a triggering operation on a login control in the login page.
The shared editing interface can comprise a sharing control, and the user may invite other users to enter the shared editing area through the sharing control. Correspondingly, the terminal can generate the identifier of the shared editing area in response to a triggering operation on the sharing control. Optionally, the identifier of the shared editing area may be information for uniquely identifying one shared editing area, such as an Identification (ID) of the shared editing area, a password of the shared editing area, a graphic code (such as a barcode, a two-dimensional code, etc.), a link, or the like, but is not limited thereto.
Further, for the user of the other terminal receiving the identifier of the shared editing area, the user can enter the shared editing area through the identifier of the shared editing area. In the embodiment of the application, for convenience of description and distinction, a terminal entering a shared editing area through an identifier of the shared editing area shared by a first terminal is defined as a second terminal. Wherein, the number of the second terminals can be 1 or more. Plural means 2 or more.
Accordingly, the second terminal may display the login page in response to an access operation for the web page production platform or system. The user of the second terminal can input the identifier of the shared editing area through the login page. Correspondingly, the second terminal can acquire the identifier of the shared editing area; and entering the shared editing area based on the identification of the shared editing area.
After the second terminal enters the shared editing area, each of the plurality of terminal devices serving the same shared editing area may, in step 201 and step 301, display a shared editing page, which is mainly used for making or editing a web page.
In this embodiment, in order to facilitate communication between the users corresponding to the terminal devices in the same shared editing area, in step 202, the input message may further be acquired in response to a communication trigger event in the shared editing area. The message may reflect the production direction for the web page currently being produced in the shared editing area.
The communication trigger event refers to an event that can trigger the terminal device to acquire a message to be communicated to other terminal devices. The implementation form of the communication trigger event will be described in the following embodiments and is not detailed here.
Further, in step 203, the acquired message may be provided to the server device. Accordingly, the server device can provide the message to the other terminal devices serving the same shared editing area as the terminal device providing the message. The number of other terminal devices is 1 or more; multiple refers to 2 or more, but less than the total number of terminal devices serving the same shared editing area. Accordingly, the other terminal device may receive the message in step 302 and display the identification of the message on the shared editing interface in step 303. Through messages that reflect the production direction of the web page currently being produced in the shared editing area, communication among the users of the plurality of terminal devices regarding the production direction of the web page can be realized. For the specific implementation of outputting the identification of the message by the other terminal devices, reference may be made to the related content of the above system embodiment, which is not repeated here.
In this embodiment, the concept of a shared editing area is introduced on a web page production platform, and an online communication function is added for users belonging to the same shared editing area. The terminal devices of the users in the same shared editing area can display a shared editing interface for web page production; in response to a communication trigger event on the shared editing interface, an input message is acquired and provided to the server device; the server device can then provide the message to the other terminals of the shared editing area. In this way, timely communication among users is realized during web page production: the users can promptly discuss the direction in which the web page should be made or modified, developers can promptly modify the web page content, and web page production efficiency is improved.
In the embodiments of the present application, the terminal device providing the message may be any terminal device serving the same shared editing area, and the terminal devices have the same functions. In the following embodiments, the information processing method provided in the embodiments of the present application is described by taking the first terminal as an example.
In this embodiment, when the first terminal acts as a message sender, it may acquire an input message in response to a communication trigger event on the shared editing interface. The specific implementation form of the communication trigger event is not limited; the following is an exemplary description with reference to several alternative embodiments.
Embodiment A: the shared editing interface comprises a message input control. Accordingly, the communication trigger event may be implemented as an event generated by a touch operation on the message input control. The first terminal can acquire input data in response to such an event, package the input data into a message, and provide the message to the server device. The message may be a text message or a voice message, and may reflect the production direction for the web page currently being produced in the shared editing area.
In some embodiments, the first terminal may provide a text message input function, in which case the message input control may be implemented as a text input control. The first terminal can acquire input text data in response to a text communication trigger event generated by a touch operation on the text input control, and package the input text data into a text message.
In other embodiments, the first terminal may also provide a voice input function, in which case the message input control may be implemented as a voice input control. Accordingly, the communication trigger event may be implemented as a voice communication trigger event generated by a touch operation on the voice input control. The first terminal can acquire input voice data in response to such an event and package the acquired voice data into a voice message. A specific implementation is as follows: in response to a voice communication trigger event generated by a touch operation on the voice input control of the shared editing interface, the first terminal starts its microphone to pick up sound; in response to a voice communication stop event, it controls the microphone to stop picking up sound; it acquires at least the sound data picked up before the microphone stopped as input data; and it encapsulates that sound data into a voice message. For a specific implementation of the voice communication stop event, reference may be made to the related contents of the above system embodiment, which are not repeated here.
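The voice-message flow just described (a touch starts the microphone, a stop event ends pickup, and the collected sound is packaged as a voice message) can be modeled with a small state machine. This is an illustrative sketch only; the class name, event handlers, and message format are assumptions, not the actual implementation.

```python
class VoiceRecorder:
    """Illustrative model of the voice input control's capture flow."""

    def __init__(self):
        self.recording = False
        self.buffer = []

    def on_voice_trigger(self):
        # Touch on the voice input control: start the microphone.
        self.recording = True
        self.buffer = []

    def on_sound(self, chunk):
        # Sound picked up by the microphone while recording is active.
        if self.recording:
            self.buffer.append(chunk)

    def on_stop_event(self):
        # Voice communication stop event: stop picking up sound and
        # encapsulate the sound picked up so far as a voice message.
        self.recording = False
        return {"type": "voice", "data": b"".join(self.buffer)}
```

Sound arriving after the stop event is ignored, matching the statement that only data picked up before the microphone stopped becomes input data.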
The second terminal can receive, from the server device, the message provided by the first terminal and output it, realizing communication between users serving the same shared editing area in a web page production scenario. The second terminal may display the shared editing interface and display an identifier of the received message on it; the identifier of the message may be a message frame corresponding to the message.
Further, after the communication of the users of the plurality of terminal devices about the production direction of the web page is completed, or completed in stages, the developer of the web page may edit or modify the web page based on that communication content. Correspondingly, any terminal can, in response to an editing operation for the web page, acquire the web page content associated with the editing operation and edit or modify the web page based on it; the editing operation for the web page is issued based on the communication contents of the users of the plurality of terminal devices about the production direction of the web page.
The embodiments of the present application can thus provide both a real-time online communication function and a multi-user collaborative development function. To prevent resource conflicts, the web page elements on which different users perform editing operations at the same time are different. Each of the plurality of terminals serving the same shared editing area can, in response to an editing operation on a web page element, acquire the web page content associated with that editing operation and edit or modify the web page element based on it; the web page elements on which the plurality of terminal devices execute editing operations at the same time are different. For a specific implementation of how to ensure this, reference may be made to the related contents of the above system embodiment, which are not repeated here.
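One simple way to guarantee that terminals never edit the same web page element at the same time is a per-element lock, sketched below. This is an assumed mechanism for illustration only; the application does not prescribe a specific conflict-avoidance scheme, and the names used here are hypothetical.

```python
class ElementLocks:
    """Illustrative per-element locking so concurrent editors never collide."""

    def __init__(self):
        self.owner = {}  # element identifier -> terminal id holding the lock

    def acquire(self, element_id, terminal_id):
        """Return True if the terminal may edit the element right now."""
        # First requester becomes the owner; later requesters are refused
        # until the owner releases, so concurrent edits target different elements.
        holder = self.owner.setdefault(element_id, terminal_id)
        return holder == terminal_id

    def release(self, element_id, terminal_id):
        # Only the current owner can release the element.
        if self.owner.get(element_id) == terminal_id:
            del self.owner[element_id]
```

In a real system the lock table would live on the server device so that all terminals of a shared editing area see a consistent view.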
It should be noted that the execution subjects of the steps of the methods provided in the above embodiments may be the same device, or different devices may serve as the execution subjects of the methods. For example, the execution subject of steps 201 and 202 may be device A; or the execution subject of step 201 may be device A and the execution subject of step 202 may be device B; and so on.
In addition, in some of the flows described in the above embodiments and the drawings, a plurality of operations are included in a specific order, but it should be clearly understood that the operations may be executed out of the order presented herein or in parallel, and the sequence numbers of the operations, such as 201, 202, etc., are merely used for distinguishing different operations, and the sequence numbers do not represent any execution order per se. Additionally, the flows may include more or fewer operations, and the operations may be performed sequentially or in parallel.
Accordingly, embodiments of the present application also provide a computer-readable storage medium storing computer instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-mentioned information processing methods.
Fig. 4 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in Fig. 4, the terminal device includes: a memory 40a, a processor 40b, a communication component 40c and a screen 40d; the memory 40a is used for storing computer programs.
In this embodiment, the terminal device may serve as a message sending end. Accordingly, the processor 40b is coupled to the memory 40a, the communication component 40c and the screen 40d, and executes the computer programs to: display a shared editing interface for web page production on the screen 40d; acquire an input message in response to a communication trigger event on the shared editing interface; and provide the message to the server device through the communication component 40c, so that the server device provides the message to a second terminal serving the same shared editing area as the terminal device. The message may reflect a production direction for the web page currently being produced in the shared editing area.
In some embodiments, the shared editing interface comprises a message input control. When acquiring the input message, the processor 40b is specifically configured to: acquire input data in response to a communication trigger event generated by a touch operation on the message input control of the shared editing interface, and package the acquired data into a message.
Optionally, the terminal device includes an audio component 40e, which may include a microphone 40e1. Accordingly, the message input control includes a voice input control. When acquiring the input data, the processor 40b is specifically configured to: start the microphone 40e1 of the terminal device to pick up sound in response to a voice communication trigger event generated by a touch operation on the voice input control of the shared editing interface; control the microphone 40e1 to stop picking up sound in response to a voice communication stop event; and acquire the sound data picked up by the microphone 40e1 before it stopped as the input data.
Correspondingly, when packaging the acquired data into a message, the processor 40b is specifically configured to: package the sound data picked up before the microphone 40e1 stopped picking up sound into a message.
Optionally, when controlling the microphone to stop picking up sound, the processor 40b is specifically configured to: control the microphone 40e1 to stop picking up sound in response to a voice communication stop event generated by a touch operation on a stop-voice-input control; or, alternatively, control the microphone 40e1 to stop picking up sound in response to a voice communication stop event generated by ending the touch operation on the voice input control.
In other embodiments, the message input control comprises a text input control. When acquiring the input data, the processor 40b is specifically configured to: acquire input text data as the input data in response to a text communication trigger event generated by a touch operation on the text input control.
Correspondingly, when packaging the acquired data into a message, the processor 40b is specifically configured to: package the input text data into a message.
In the embodiments of the present application, the processor 40b is further configured to: acquire, in response to an editing operation for the web page, the web page content associated with the editing operation, and edit or modify the web page based on it; the editing operation for the web page is issued based on the communication contents of the users of the first terminal and the second terminal about the production direction of the web page.
Optionally, the processor 40b is further configured to: enter a shared editing area in response to a login event; generate an identifier of the shared editing area in response to a sharing event; and provide the identifier of the shared editing area to the second terminal via the communication component 40c, so that the second terminal can enter the shared editing area based on that identifier.
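The generation and sharing of the area identifier can be sketched as follows. Using `uuid4` to form the identifier is purely an assumption made for this example, as is every name below; the application does not specify how the identifier is generated or transported.

```python
import uuid

class FirstTerminal:
    """Illustrative model: the first terminal creating a shared editing area id."""

    def on_share_event(self):
        # In response to a sharing event, generate an identifier for the
        # shared editing area (uuid4 is one possible uniqueness scheme).
        self.area_id = f"area-{uuid.uuid4().hex[:8]}"
        return self.area_id

def send_to_second_terminal(area_id):
    # Stand-in for the communication component providing the identifier to
    # the peer terminal, which can then enter the area based on it.
    return {"shared_area_id": area_id}
```

The second terminal would feed the received `shared_area_id` into its login page flow to enter the same shared editing area.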
In the embodiments of the present application, the processor 40b is further configured to: acquire, in response to an editing operation on a web page element, the web page content associated with that editing operation, and edit or modify the web page element based on it; the web page elements on which the first terminal and the second terminal execute editing operations at the same time are different.
In the embodiments of the present application, the terminal device may also serve as a message receiving end. Accordingly, the processor 40b is further configured to: display a shared editing interface for web page production on the screen 40d; receive, through the communication component 40c, messages from other terminals provided by the server device, the other terminals serving the same shared editing area as the terminal device; and display the identifiers of the messages on the shared editing interface. A message may reflect a production direction for the web page currently being produced in the shared editing area.
In some optional embodiments, as shown in Fig. 4, the terminal device may further include: a power supply component 40f, and the like. Only some of the components are schematically shown in Fig. 4; this does not mean that the terminal device must include all of the components shown in Fig. 4, nor that it can include only those components.
The terminal device provided in this embodiment introduces the concept of a shared editing area on a web page production platform and adds an online communication function for users serving the same shared editing area. The terminal devices of the users in the same shared editing area can display a shared editing interface for web page production; in response to a communication trigger event on the shared editing interface, an input message is acquired and provided to the server device; the server device can then provide the message to the other terminals of the shared editing area. In this way, timely communication among users is realized during web page production: the users can promptly discuss the direction in which the web page should be made or modified, developers can promptly modify the web page content, and web page production efficiency is improved.
In the embodiments of the present application, the memory is used to store computer programs and may be configured to store various other data to support operations on the device on which it is located. The processor may execute a computer program stored in the memory to implement the corresponding control logic. The memory may be implemented by any type of volatile or non-volatile memory device, or a combination thereof, such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disks, or optical disks.
In the embodiments of the present application, the processor may be any hardware processing device that can execute the above-described method logic. Optionally, the processor may be a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), or a Micro Controller Unit (MCU); a programmable device such as a Field-Programmable Gate Array (FPGA), a Programmable Array Logic device (PAL), a Generic Array Logic device (GAL), or a Complex Programmable Logic Device (CPLD); an Advanced RISC Machine (ARM) processor; or a System on Chip (SoC); but is not limited thereto.
In the embodiments of the present application, the communication component is configured to facilitate wired or wireless communication between the device in which it is located and other devices. The device in which the communication component is located can access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G, 5G, or a combination thereof. In an exemplary embodiment, the communication component receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component may also be implemented based on Near Field Communication (NFC) technology, Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, or other technologies.
In the embodiment of the present application, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation.
In embodiments of the present application, a power supply component is configured to provide power to various components of the device in which it is located. The power components may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the device in which the power component is located.
In the embodiments of the present application, the audio component may be configured to output and/or input audio signals. For example, the audio component includes a Microphone (MIC) configured to receive an external audio signal when the device in which the audio component is located is in an operational mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signal may further be stored in a memory or transmitted via a communication component. In some embodiments, the audio component further comprises a speaker for outputting audio signals. For example, for devices with a voice interaction function, voice interaction with a user may be enabled through the audio component, and so on.
It should be noted that, the descriptions of "first", "second", etc. in this document are used for distinguishing different messages, devices, modules, etc., and do not represent a sequential order, nor limit the types of "first" and "second" to be different.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include forms of volatile memory in a computer readable medium, Random Access Memory (RAM) and/or non-volatile memory, such as Read Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an example of the present application and is not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.