
A First Look at Online Interactive Performance Technology

The epidemic that swept the world in early 2020 hit the performance industry hard, and many musicians and entertainment companies began to explore online concerts. Online live broadcasts differ from offline performances in many ways, and how to use technology to strengthen the relationship between audiences and performers has become a new topic. To better enhance the audience's sense of participation, students at INSA HdF came up with some new ideas: more than 20 graduate students in the audio and video field formed a group called Tempos Event and decided to add interactive segments to an online concert. The author served as the head of interactive technology in this group.

1. Interactive Design

As an experimental interactive performance project, the interactive design of this project is not complicated. Specifically, when a particular camera shot appears in the live broadcast, viewers who click a specific area of the screen change the background effects, and the lighting and video change with them during the live broadcast. As a simple example, whether the whole party officially starts after the warm-up program depends on whether a sufficient proportion of interactive users click a specific area of the live picture in the first interactive scene. Once the number of clicks reaches the set percentage, a burst animation rendered in real time on the background, together with the lighting effects, ignites the live broadcast. Subsequent segments feature other forms of interaction, each with its own interactive results.

2. Technical realization

The interactive technology consists of three parts: the front end, the middle layer, and the back end. The front end is the online live page and the interactive elements within it; the back end is the real-time rendering software and the software that plays back pre-produced material; the middle layer is the bridge between the two, responsible for triggering the corresponding real-time rendering according to the creative director's interactive design.

2.1 Analysis of front-end technology

The online live page is built mainly with HTML5, CSS3, and JavaScript. Apart from the necessary explanatory text, its main parts are the embedded video playback window of a third-party live streaming platform and the interactive elements floating above that window. Because the interactive elements are closely tied to the live content and different programs require different elements, asynchronous Ajax data requests are essential. The viewer's browser requests the current cue point from the server at a fixed rate; when the cue point changes, it means a new program has started or an interactive session within the current program has started or ended. The browser then requests the new interactive elements from the server and renders them on top of the live video. During the actual broadcast, these cue points are updated manually by the creative team according to the progress of the performance.
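
In the actual project this polling ran as Ajax in the viewer's browser; the sketch below shows the same fixed-rate polling logic in Python, with a hypothetical server address, endpoint names, and JSON fields.

```python
# Minimal sketch of the cue-point polling loop (endpoint names are hypothetical).
import time
import requests

BASE_URL = "https://example.org/live"   # hypothetical server address
POLL_INTERVAL = 2.0                     # seconds between requests

def poll_cue_points():
    current_cue = None
    while True:
        cue = requests.get(f"{BASE_URL}/cue", timeout=5).json()["cue"]
        if cue != current_cue:
            # A new program or interactive session has started or ended:
            # fetch the interactive elements that belong to this cue point.
            elements = requests.get(f"{BASE_URL}/elements",
                                    params={"cue": cue}, timeout=5).json()
            print(f"cue changed {current_cue} -> {cue}: render {len(elements)} elements")
            current_cue = cue
        time.sleep(POLL_INTERVAL)

if __name__ == "__main__":
    poll_cue_points()
```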

When a viewer clicks or hovers over an interactive element and the trigger conditions are met, a feedback animation is shown immediately in the browser, and at the same time a successful trigger record is written to the website's MySQL database via Ajax. Trigger records from the many users are processed by the website's data analysis program, which generates statistics for each interaction. At this point the front end's work is done, and the statistics are picked up by the middle layer.
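
As a rough illustration of the statistics step, the sketch below aggregates successful trigger records into per-element click counts and a participation ratio; the record format, field names, and the 30% threshold are illustrative, not the website's actual schema.

```python
# Sketch of per-interaction statistics (illustrative data model and threshold).
from collections import Counter

def interaction_stats(trigger_records, active_viewers, threshold=0.30):
    """trigger_records: list of (viewer_id, element_id) successful triggers."""
    clicks_per_element = Counter(element for _, element in trigger_records)
    unique_clickers = len({viewer for viewer, _ in trigger_records})
    ratio = unique_clickers / active_viewers if active_viewers else 0.0
    return {
        "clicks_per_element": dict(clicks_per_element),
        "participation_ratio": ratio,
        "triggered": ratio >= threshold,   # e.g. whether the opening burst animation fires
    }

records = [(1, "flame"), (2, "flame"), (2, "star"), (3, "flame")]
print(interaction_stats(records, active_viewers=10))
```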

2.2 Back-end technology analysis

The back end of this project mainly covers video and the various audio and lighting effects. The video falls into two categories: video rendered in real time with the vvvv software, and pre-produced material played back through the Millumin software. Four projectors were used on site, projecting onto the three faces of the curtain and the white set piece in front of it, with Millumin's built-in mapping function used to model and map the curtain area and the white set. The video rendered in real time by vvvv serves as an input source for Millumin; this software-based approach replaces a hardware control system such as the Barco Event Master E2 and a playback system such as WatchOut.

vvvv is a real-time rendering environment that lets you build a rendering workflow graphically, an artist-friendly way to construct real-time rendering programs. It can import external sources such as video or image sequences, or generate images with its own particle, gravity, and fluid systems. In this project, most of the real-time visuals driven by the interactive data were built with the software's particle system.

2.3 Analysis of middle layer technology

The middle layer was developed entirely from scratch. The front end relies on mature web page and framework technology, and the back end uses off-the-shelf software; opening up communication between the website and that professional software is the key job of the middle layer. At the start of the project, support for an open control protocol (such as MIDI or OSC) was the baseline set by the interactive technology team, so the creative team checked for external control support when choosing the video playback and real-time rendering software. After vvvv and Millumin were selected, the interactive technology team studied the Open Sound Control (OSC) protocol that both applications can receive. Since vvvv and Millumin run on Windows and macOS respectively, the team took platform compatibility into account during development and built a series of small OSC-sending tools on Qt's cross-platform framework. We enumerated all the OSC commands that might be needed to control vvvv and Millumin, wrote a short manual, and asked the creative team to run control tests with the self-developed tools.
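
The project's own OSC-sending gadgets were built with Qt; the sketch below shows the same idea in Python using the python-osc package. The hosts, ports, and OSC addresses are illustrative assumptions, not entries from the team's command manual.

```python
# Minimal sketch of sending OSC commands to vvvv and Millumin
# (IPs, ports, and addresses are illustrative only).
from pythonosc.udp_client import SimpleUDPClient

vvvv = SimpleUDPClient("192.168.1.10", 4444)      # machine running vvvv (Windows)
millumin = SimpleUDPClient("192.168.1.20", 5000)  # machine running Millumin (macOS)

# Drive the particle system with the latest interaction ratio (hypothetical address).
vvvv.send_message("/interaction/ratio", 0.42)

# Ask Millumin to jump to another column of pre-produced material (hypothetical address).
millumin.send_message("/millumin/action/launchColumn", 3)
```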

Control test

Completion of the control test meant that communication between the middle layer and the back end was working. Because the front-end data travels over the network and therefore carries some risk of failure, we gave priority to developing a manual control mode: the number of participants and the number of clicks on each interactive element are entered by hand and displayed item by item in the middle-layer control software. We then added an automatic mode in which per-interaction statistics generated by the website in JSON format are read at a fixed frequency by the middle-layer software, with the ability to switch between manual and automatic operation. With security ensured, the online front end and the offline back end are connected through this self-developed middle-layer software.
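
A compact sketch of the manual/automatic switch described above, assuming a hypothetical statistics URL and JSON field name; forwarding over OSC would reuse the clients shown earlier.

```python
# Sketch of the middle layer's manual/automatic mode (URL and JSON field are hypothetical).
import time
import requests

class MiddleLayer:
    def __init__(self, stats_url, period=1.0):
        self.stats_url = stats_url
        self.period = period
        self.manual_mode = True     # start in manual mode for safety
        self.manual_value = 0.0

    def current_ratio(self):
        if self.manual_mode:
            return self.manual_value                    # value typed in by the operator
        data = requests.get(self.stats_url, timeout=5).json()
        return data["participation_ratio"]              # value produced by the website

    def run_once(self):
        ratio = self.current_ratio()
        # ...forward the value to vvvv / Millumin over OSC here...
        print("ratio =", ratio)
        time.sleep(self.period)
```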

middle-layer system during performance



Xiangyan Tournesol – Technical Analysis (1/3)

First exhibition: March 2020, CO-EXISTENCE.S, GALERIE COMMUNE, Tourcoing, France

Intro:
Xiangyan Tournesol is an interactive art installation created by the contemporary artist Xiangyan, with Mingcong providing technical support and handling the work's complex technical implementation. When different numbers of viewers approach from different directions, stop to watch, or talk about the sunflower, the work shows different reactions. This article introduces the work from three key technical perspectives.

  • Holographic Display Technology

It should be noted that the holographic display technology here refers to a hologram illusion based on light reflection, similar to Pepper's ghost; this technique is currently used mostly in studio productions or theatre performances. We installed a transparent structure made of four acrylic sheets on top of the work, forming an inverted-pyramid shape that is widely used for lightweight hologram illusions, and placed a monitor beneath the pyramid so that the acrylic sheets reflect what is shown on the screen below.

The difficulty here is constructing a sensible four-view video and rendering it in real time. To achieve this, we had the artists model and animate in 3D software and used Unity as the real-time rendering engine, then worked out a picture output method that produced the desired artistic effect. The final result is a sunflower that appears three-dimensional.
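
As a rough illustration of that picture output method: the four camera views can be laid out around the centre of the screen, each rotated so that its reflection in the corresponding acrylic face appears upright. The sketch below only shows that layout arithmetic; the real work computed it inside Unity, and the resolution, view size, and rotation angles here are assumptions rather than the installation's actual values.

```python
# Illustrative layout of four camera views for a pyramid (Pepper's ghost style) display.
def four_view_layout(screen_w, screen_h, view_size):
    """Place front/right/back/left views around the screen centre, one per acrylic face."""
    cx, cy = screen_w / 2, screen_h / 2
    half = view_size / 2
    return {
        "front": {"center": (cx, cy + half), "rotation_deg": 0},
        "right": {"center": (cx + half, cy), "rotation_deg": 90},
        "back":  {"center": (cx, cy - half), "rotation_deg": 180},
        "left":  {"center": (cx - half, cy), "rotation_deg": 270},
    }

for side, placement in four_view_layout(1920, 1080, 500).items():
    print(side, placement)
```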

Art Interactive - Sunflower


Case Review – WTA Finals 2019(2/2) – Audio & Video

In this finals, all audio signals and system control signals were carried over a fiber-optic network, with the main and backup audio signals transmitted via the Dante network protocol, chosen mainly for the convenience and scalability of the system. The sound reinforcement system had to cover the stadium's 12,737 fixed seats plus the additional VIP area on the floor, with even coverage and a sufficiently high sound pressure level.
In addition, the microphones were from the Shure AD (Axient Digital) series, using AD's quad-diversity (Quadversity) feature to ensure signal stability in such a large venue.

<< Case Review – WTA Finals 2019(1/2) – Lighting & Rigging

In addition to sound reinforcement for the matches, the event also included opening ceremony performances, player entrances, and award ceremonies, so the system not only had to meet the needs of the competition but also deliver enough sound pressure level and dynamics for the performances. The German consultant team provided a fairly detailed design in the early stage, but because it lacked detailed knowledge of the venue, we adjusted the design to the actual site conditions. A total of 102 full-range speakers were used, hung in 15 groups to cover the audience area and the performance area within the field of play. Twenty subwoofers were used, concentrated in the center of the field; this arrangement helps the uniformity of coverage across the venue and keeps the low end consistent with the full-range system.

Video System

The E2 screen management system raises the bar for real-time screen management, delivering superior picture quality, outstanding input and output density, massive scalability, and durability. Supporting native 4K input and output, it is an industry-leading 4K screen management system that can handle refresh rates up to 60 Hz, full 4:4:4 color sampling, and 12-bit processing.

The WATCHOUT control system edits and sequences the videos and pictures to be played according to the overall event running order, and can control and deploy the display servers. (The backup machine carries the same content and functions as the main machine, and can be switched to in real time if the main machine fails, ensuring the event runs smoothly.)

Together, these systems created a polished display for the WTA event, delivered better visual effects, made the broadcast of the program smoother, and let the audience experience the high-quality picture that point-to-point (pixel-accurate) screen display brings.


Case Review – WTA Finals 2019(1/2) – Lighting & Rigging

The “Shiseido·Shenzhen WTA Finals” is the highest-level women's tennis event in the world, so the requirements for the lighting system and the lighting itself are very high.

For this finals, the entire lighting system used a dual-protocol network, with MA-Net as the primary protocol and Art-Net as the supplement.
The backbone network was formed by five Cisco gigabit switches connected in a ring. The MA-Net protocol was used to control the fixtures in the venue, and three NPU network processing units helped the console with data calculation and transmission. To keep data transmission real-time and reliable, the gigabit switches on the backbone were linked by optical fiber.
In addition, to guarantee the normal operation of fixtures in particular positions, we added six network decoders to feed signals to fixtures in tricky spots. The links between the gigabit switches and the network decoders, and between the MA2 console and the gigabit switches, used professional Category 6 cable to ensure fast and stable data transmission.
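
Because Art-Net, the supplementary protocol here, is a simple UDP format, a small test tool for exercising one of the network decoders can be sketched in a few lines. The node IP, universe, and channel values below are illustrative, not the production patch.

```python
# Minimal sketch: build and send one Art-Net ArtDmx packet to a node.
import socket
import struct

def artdmx_packet(universe, dmx_data, sequence=0):
    dmx = bytes(dmx_data)[:512]
    if len(dmx) % 2:                                # DMX payload length must be even
        dmx += b"\x00"
    packet = b"Art-Net\x00"                         # protocol ID
    packet += struct.pack("<H", 0x5000)             # OpDmx opcode, little-endian
    packet += struct.pack(">H", 14)                 # protocol version (hi byte, lo byte)
    packet += bytes([sequence & 0xFF, 0])           # sequence, physical input port
    packet += struct.pack("<H", universe & 0x7FFF)  # SubUni + Net (15-bit port address)
    packet += struct.pack(">H", len(dmx))           # data length, hi byte first
    return packet + dmx

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Set the first channel of universe 0 to full (node IP and patch are illustrative).
sock.sendto(artdmx_packet(0, [255] + [0] * 511), ("192.168.1.50", 6454))
```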

The difficulties encountered in the construction and maintenance of on-site lighting:
More than 200 electric chain hoists and over 1,000 meters of lighting truss had to be rigged in the stadium. The first difficulty we encountered was the position of the rigging points for the truss. The conventional points over the infield could basically meet the design requirements, but the pulled-back points over the auditorium had never been rigged since the venue was built, so the work was difficult and the risk extremely high. In the end our team overcame all kinds of difficulties and solved the problem of the pulled-back points over the auditorium. The next problem was replacing fixtures during the maintenance phase: once the lighting installation in the venue was complete, some faulty fixtures could not be reached by lowering the truss and had to be serviced from the pulley points at the top of the dome, which made maintenance very difficult, but in the end these problems were also solved.

To ensure that the illuminance and color reproduction of the fixtures met the design standards of the foreign team, in the 20 days before the project's lighting equipment moved in we carried out all-round maintenance on all moving-head and moving-head LED fixtures and replaced the lamps in every moving-head discharge fixture. Spare lamps and maintenance parts equivalent to 10% of the total number of moving-head fixtures were also prepared on site. In the end, the success of the tennis event was guaranteed.

The on-site lighting finally exceeded the expectations of the German consultant team and was highly praised by the lighting crew. The whole event lasted more than 20 days, during which the entire lighting system was never switched off and performed with complete stability!

Case Review – WTA Finals 2019(2/2) – Audio & Video >>


How do we manage RF?

The six-step wireless management method provides you with a safe and reliable wireless service.

Step 1:

Perform a static RF scan of the on-site wireless environment and place the wireless systems in frequency bands that avoid the existing interference in the venue.

Step 2:

Design and lay out the antenna system according to the overall lighting and stage-design arrangement to ensure complete and accurate antenna coverage.

Step 3:

Confirm the number and types of wireless channels. Wireless microphones of all brands and every other type of wireless device on site must be registered with us for subsequent coordination.

Step 4:

Carry out wireless frequency planning and coordination, a very important part of RF management. A successful coordination meeting greatly facilitates the work.

Step 5:

Make equipment and frequency adjustments (if necessary) during rehearsal and monitor during performances.

Following the cue sheet, monitor the signal strength and audio quality of in-use and upcoming wireless microphones ahead of time.


Honor of Kings Finals Almost encountered frequency interference? (1/2)

On February 23, the 2017 Honor of Kings (HoK) Autumn Finals were held at the “Spring Cocoon” gymnasium of the China Resources Shenzhen Bay Sports Center. As the highest-level official professional Honor of Kings event, it filled the venue to capacity amid thunderous applause, and the peak showdown between QGhappy and XQ ignited the audience's enthusiasm.

Our wireless team was naturally indispensable at an event site of this level. A 16-channel Shure ULX-D digital wireless system and a 2-channel Shure flagship Axient Digital system were used for the hosts and performers; the system was designed by Mr. Mingcong and guaranteed the smooth operation of the wireless setup.

Source of On-Site Interference

At the site of such a large-scale event, the wireless environment is extremely complex: LED floor screens, lighting control, complex stage machinery, and other wireless equipment all caused a certain amount of interference to our wireless microphones.

  • In-Ear Monitor (IEM)

A total of four channels of Sennheiser IEMs were used on site, and there was a huge gap in the venue's noise floor between all of them being on and all of them being off.

On-site spectrum with all IEMs off
Spectrum with 1 IEM channel on
Spectrum with 4 IEM channels on

From the plots it is easy to see that the intermodulation distortion (IMD) between these IEM channels has a major impact on the on-site wireless environment and seriously reduces the spectrum available to the wireless microphones.

  • Intercom system

In addition, because of the nature of e-sports competitions, smooth communication between players is critical, so the on-site intercom system has a higher priority than the wireless microphones.

However, the intermodulation distortion of the intercom system greatly interferes with the frequency band where the wireless microphone is located.

400-600 MHz spectrum with the intercom system off
400-600 MHz spectrum with the intercom system on

As the figures show, once the wireless intercom system is switched on, a large part of the band used by the wireless microphones is occupied. If a microphone channel happens to sit on an intermodulation product of the wireless intercom, dropouts will occur, which further increases the difficulty of frequency planning.
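
The danger described here can be made concrete with a quick third-order intermodulation check: for any two transmit frequencies, the 2*f1 - f2 and 2*f2 - f1 products are the usual troublemakers. The frequencies and guard band below are illustrative, not the values used at the event.

```python
# Sketch of a third-order intermodulation check (illustrative frequencies and guard band).
from itertools import permutations

def third_order_products(tx_freqs_mhz):
    return sorted({round(2 * f1 - f2, 3) for f1, f2 in permutations(tx_freqs_mhz, 2)})

def is_clear(candidate_mhz, tx_freqs_mhz, guard_khz=100):
    products = third_order_products(tx_freqs_mhz)
    return all(abs(candidate_mhz - p) * 1000 >= guard_khz for p in products)

intercom_tx = [486.200, 487.400, 489.000]   # illustrative intercom transmit frequencies
print(third_order_products(intercom_tx))    # includes 485.000, 483.400, 488.600 ... MHz
print(is_clear(485.000, intercom_tx))       # False: a mic parked here would risk dropouts
```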

  • Distance between the IEM transmitting antennas and the wireless system's receiving antennas

The distance between the IEM transmitting antenna and the receiving antenna of the wireless system also affects the site's noise floor: the closer they are, the higher the noise floor; the farther apart they are, the lower it is.

Spectrum with 2.4 meters between the transmitting and receiving antennas
Spectrum with 1.2 meters between the transmitting and receiving antennas

It is clear that at the performance site we should keep the IEM transmitting antennas as far as possible from the wireless system's receiving antennas.
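
Doubling the antenna spacing should, in free space, reduce the coupled level by about 6 dB; a quick sketch of that estimate (the frequency is an arbitrary illustrative value):

```python
# Free-space path loss sketch: doubling the IEM-to-receiver antenna spacing
# adds about 6 dB of attenuation (illustrative frequency, free-space assumption).
import math

def fspl_db(distance_m, freq_mhz):
    # FSPL(dB) = 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    return 20 * math.log10(distance_m / 1000) + 20 * math.log10(freq_mhz) + 32.44

f = 600.0  # MHz
print(fspl_db(2.4, f) - fspl_db(1.2, f))   # ~6.02 dB more attenuation at 2.4 m than at 1.2 m
```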

>>Honor of Kings Finals Almost encountered frequency interference? (2/2)


Honor of Kings Finals Almost encountered frequency interference? (2/2)

The key to solving wireless microphone dropouts is really to improve the signal-to-noise ratio of the wireless system, and improving the signal-to-noise ratio is usually approached from two directions.

<<Honor of Kings Finals Almost encountered frequency interference? (1/2)

  • Reduce the noise floor of the RF signal

The main causes of the high RF noise floor at the HoK finals were interference from the large LED screens and intermodulation distortion from the intercom system. To save on transport costs, the back covers of many LED screen driver circuits had been removed, destroying their original electromagnetic shielding and causing strong electromagnetic interference. We therefore first check whether the back of each panel still has its shielding layer; if not, we wrap the circuit section in aluminum foil. We then check whether the LED controllers have any unused output ports; if so, we fit them with terminators or wrap them in aluminum foil as well.

A video output port without a terminator may radiate RF signals

Another problem that needs to be solved is the serious intermodulation distortion from the on-site intercom system. First, the wireless system frequencies need to be recalculated with the intercom system switched on.

Then, depending on the situation on site, we can widen the spacing between the intercom transmitting antennas to reduce the intensity of the intermodulation interference. Through the steps above, the site's noise floor was improved.

  • Increase the signal strength of the channel

First of all, we need to check whether the receiving antennas are placed sensibly; if not, we adjust their placement to shorten the distance between the receiving and transmitting antennas as much as possible. In addition, depending on site conditions, we can use strongly directional antennas such as the Shure HA-8089.


How to Cover a Cruise Ship with Wireless Microphones

China's first “drifting multi-dimensional experience drama”, “Zhiyin”, was created over two years by the famous director Fan Yue and his team. A real 120-meter cruise ship, fully restored to its appearance of the last century and able to travel freely on the Yangtze River, was turned into a mobile theater. Such an innovative form of performance undoubtedly poses a major challenge for wireless audio transmission, and we answered that challenge with the excellent digital audio quality and signal stability of the Shure ULX-D digital wireless system.

The prototype of the “Zhiyin” mobile theater was the “Jianghua” steamship of Wuhan's Minsheng Company from the early last century. The director team required the wireless signal to cover every space on the whole ship while also achieving good sound quality. Facing this special pickup environment, we used the ULX-D digital wireless system's spectral efficiency, long transmission range, and ability to deliver the audio signal without distortion even in relatively high RF noise to create an immersive sound experience for the show. It is these unique advantages of digital wireless systems that allowed the director team's pickup requirements for this experiential drama to be met.


Since the wireless system operates on a cruise ship, we designed a solution that achieves seamless RF coverage across the three cabin decks, ensuring that the bodypack transmitter worn by any actor has a stable RF signal and a good audio signal in every corner, inside or outside the cabins. This was the first time we had done a wireless coverage project of this scale on a cruise ship: the cabin space is about 90 meters long and 20 meters wide, the three decks cover more than 6,000 square meters in total, and about 2,000 meters of RF cable were used.

To achieve full coverage inside the cabins, we used a distributed antenna system. With antenna coverage on this scale, choosing the antenna positions is a trade-off: the antennas must cover the entire cabin, yet the feeder runs should be kept as short as possible, because even with very low-loss cable, covering such a large area means every additional length of feeder adds more loss. In the end we carefully selected six points per deck, in the middle, rear, and sides of the cabin, striking a good balance between minimum line loss and maximum coverage area.
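
The trade-off can be sketched as a simple feeder-loss budget; the attenuation, splitter, and connector figures below are illustrative assumptions, not measurements of the cable actually used on the ship.

```python
# Sketch of the feeder-loss budget behind the antenna-placement trade-off
# (all dB figures are illustrative, not the project's measured values).
def feeder_loss_db(length_m, atten_db_per_100m=6.0, splitter_db=3.5,
                   connectors=4, conn_db=0.1):
    cable = length_m * atten_db_per_100m / 100.0
    return cable + splitter_db + connectors * conn_db

for run in (20, 50, 80):   # candidate feeder runs from receiver rack to antenna point, metres
    print(f"{run:>3} m run -> {feeder_loss_db(run):.1f} dB loss")
```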


The application of AoIP system in Live Show

The show used a complex, high-tech stage design, including LED lifting modules, water-curtain projection, and holographic gauze, as well as program segments performed on ice and in water. With such a complex performance venue, how can an AoIP system handle it with ease?

With the theme “Olympic City, Ice and Snow Invitation” and staged at the Water Cube, a “four-element stage” of water, land, air, and ice presented an ice-and-snow fantasy show featuring top artists from around the world. The audio-visual feast was recorded on December 18, 2016 and broadcast on Beijing TV on the evening of December 31.

In the system design, we planned the system from the early stage and also took the acoustics of the venue into account.

The front end of the audio system is almost entirely digital: using AES3 and AoIP, we connected every device that has a digital output to the mixer digitally.

More and more wireless receivers use digital transmission or digital chips to process the sound. If an analog audio cable is used to carry the signal from a wireless receiver to the digital mixer or stage box, the digital signal inside the receiver is first converted to analog by a D/A converter and then converted back to digital by the A/D converter in the mixer or stage box. This process not only introduces the noise of two conversions (D/A and A/D), it also adds analog transmission noise on the cable, and the dynamic range of analog transmission is smaller than that of digital transmission, which inevitably degrades the original character of the sound: what the mixer receives is no longer what the wireless receiver captured.
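
To put the dynamic-range argument in rough numbers, the theoretical dynamic range of linear PCM is about 6.02 x N + 1.76 dB for N bits; real converters and analog line stages fall short of these ideal figures, which is exactly why extra D/A and A/D round trips are worth avoiding.

```python
# Quick sketch: ideal dynamic range of linear PCM vs. bit depth
# (theoretical figures only; real converters and analog stages achieve less).
def ideal_dynamic_range_db(bits):
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit PCM: about {ideal_dynamic_range_db(bits):.0f} dB")
```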