AI Video Creator
One of the key companies behind the Stable Diffusion image generator has launched an impressive AI video creation and editing tool that operates like a DALL-E for moving pictures. (DALL-E and DALL-E 2 are deep learning models developed by OpenAI that generate digital images from natural language descriptions, or "prompts".) The new "Gen-1" video tool from Runway AI marks where this technology stands now and how quickly it is advancing.

Gen-1 is not an outright text-to-video generator - it is not ChatGPT for video - and you can't just ask it to go away and make a TV commercial in the style of Hitchcock. At least, not yet. Instead, it takes an input video and creates different versions of that content in response to text, image or video prompts. That means you can film something in rough outline, just to get the basic angles, actions and camera movements right, then ask Gen-1 to recreate that footage in one or more completely different styles. You can tell it to "make this a film noir scene", "make this an underwater scene" or "put these characters into a moving vehicle". Alternatively, you could find an image or video made by somebody else that fits the style you're after and upload it for Gen-1 to use as a style template. The software analyses the input, works out what it is, and then does its best to infuse the key elements of your video with a similar look and feel.

This set of tools is clearly on its way to becoming a super-fast, super-cheap visual-effects studio in a one-stop box. Sound effects are up for AI development too: the company has another system, called Soundify, which accepts a video input, analyses what is likely happening within its scenes and then creates audio to match.
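The workflow described above - one source clip plus a single style signal (a text prompt or a reference image/video) - can be sketched in Python. This is an illustrative stand-in only: the function and parameter names below are hypothetical and do not represent Runway's actual API.

```python
from typing import Optional

def restyle_video(input_video: str,
                  prompt: Optional[str] = None,
                  style_reference: Optional[str] = None) -> dict:
    """Describe a Gen-1-style job: rough footage in, one style signal, restyled clip out.

    Hypothetical sketch -- names and structure are illustrative, not Runway's API.
    """
    if prompt is None and style_reference is None:
        raise ValueError("Gen-1 needs a text prompt or a style image/video")
    return {
        "source": input_video,                    # rough footage with the right angles and moves
        "style": prompt if prompt else style_reference,
        "mode": "text" if prompt else "reference",  # text prompt vs uploaded style template
    }

# The two usage patterns described in the article:
job_text = restyle_video("rough_cut.mp4", prompt="make this an underwater scene")
job_ref = restyle_video("rough_cut.mp4", style_reference="noir_still.jpg")
```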
Find out more
Smallest Yet VR Headset
California-based VR platform company Bigscreen has moved into hardware with the launch of its "Beyond" headset, claimed to be the world's smallest such device (less than an inch at its thinnest point) and lightest (at six times lighter than other VR headsets). "As passionate VR enthusiasts, we built the VR headset we would have wanted ourselves," said Bigscreen's founder and CEO, Darshan Shankar. "Today's leading VR headsets have doubled in weight compared to headsets from 2016, which is much too heavy, bulky and uncomfortable. We went for increased comfort, and developed ultra-high-end components like OLED micro-displays and pancake optics to increase immersion. To deliver the best software experience for watching movies in Bigscreen, we also had to build the best hardware."

The Beyond has two 1-inch micro-OLED displays - each at 2,560 x 2,560 pixels - offering a combined resolution of 5K (5,120 x 2,560), with 7.2-µm-wide pixels, RGB-stripe subpixels and "an incredible fill factor" reportedly joining forces to eliminate the so-called screen-door effect. Custom pancake optics made up of glass, plastic polymers and films produce a field of view of 93 degrees horizontal and 90 degrees vertical, along with 28-pixels-per-degree visuals.

The Beyond has SteamVR tracking baked in to keep tabs on position and orientation in three dimensions, and the headset is compatible with hundreds of VR games and apps on Steam via a cabled Windows PC. Users can also achieve full-body tracking using add-ons such as HTC Vive or Tundra trackers. The Bigscreen Beyond is up for pre-order now, starting at US$999 - which includes a 5-m (16-ft) fibre-optic cable and a Link box for connection to a games-optimised PC.
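The headline numbers above are internally consistent, which a little arithmetic confirms (figures taken from the spec as quoted; "pixels per degree" here is simply horizontal pixels divided by horizontal field of view):

```python
# Check the Bigscreen Beyond figures quoted above.
h_pixels, v_pixels = 2560, 2560   # per-eye micro-OLED panel resolution
h_fov = 93                        # horizontal field of view, degrees

combined_h = 2 * h_pixels         # two panels side by side -> the "5K" figure
ppd = h_pixels / h_fov            # angular pixel density

print(combined_h)                 # 5120 (i.e. 5,120 x 2,560 combined)
print(round(ppd))                 # 28 pixels per degree, as claimed
```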
Take a look
Triple Squeeze Pixels For Sharper Displays
Scientists are discovering how to stack sub-pixels to improve micro-LED displays. The sharpness of images on a micro-LED screen is limited by how tightly the pixels that make up the display are packed, and MIT scientists have taken a novel approach to packing them much more densely: stacking the pixels' components vertically.

On a standard OLED TV (or computer screen), each pixel is actually made up of three OLED sub-pixels - one red, one green, one blue - arranged side by side. By illuminating these tiny OLEDs in different combinations, the pixels can produce a wide range of colours. On some newer TVs, micro-LEDs serve as the sub-pixels instead of OLEDs. These micro-LED TVs are claimed to combine the brilliant colours and deep blacks of OLED screens with the brightness of LCD screens. However, micro-LED pixels can't be packed as densely as OLED pixels. This might not matter much on a TV screen, but the lower resolution could be noticeable in devices such as VR headsets.

If each micro-LED pixel were only one sub-pixel wide (instead of three), it would be possible to squeeze three times as many pixels into a given amount of screen space, greatly improving image resolution. The MIT team has done just that, creating pixels in which the micro-LEDs are stacked vertically rather than laid out laterally. Its technique begins with ultra-thin red, green and blue LED membranes being stacked one on top of the other, forming a layer-cake-like arrangement. That cake is then finely sliced in a grid pattern, dividing it into a multitude of individual pixels, each just four microns wide. The team is also working on methods of controlling millions of micro-LED pixels simultaneously, as required in devices such as VR headsets, TVs and computer screens.
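The density gain is easy to quantify. A rough calculation, assuming square pixels and using the four-micron figure quoted above:

```python
# Rough density check for the stacked micro-LED pixels described above.
pixel_width_um = 4.0                    # one stacked pixel, per the MIT team
side_by_side_um = 3 * pixel_width_um    # same sub-pixels laid out laterally instead

microns_per_inch = 25_400
ppi_stacked = microns_per_inch / pixel_width_um

print(ppi_stacked)                        # 6350 pixels per inch
print(side_by_side_um / pixel_width_um)   # 3.0x linear density gain from stacking
```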
Read more
AI Aids ET Search
The chances are incredibly small that Earth is the only planet with life, and a new AI system has now scoured millions of radio signals from space for any with potential artificial origins - discovering eight that look intriguingly alien.

If any actual extraterrestrial aliens were to scan Earth with appropriate technology, they would pick up radio chatter and other electromagnetic signals that we have been transmitting for a hundred years - the 18th of October 2022 marked one hundred years since the BBC began public radio broadcasting. With that in mind, the Breakthrough Listen initiative aimed to turn the tables on any life-forms out there and search for artificial radio signals coming from other planets in our galaxy.

But the universe is a noisy place: stars, black holes, magnetars, quasars, fast radio bursts, supernovae, gamma-ray bursts and a whole range of other objects and events can also produce radio and other signals. Add to that all the everyday interference from our own technology, such as mobile phones and GPS satellites, and the size of the problem is plain to see - or rather, to hear. Tuning out the background noise to find possible alien techno-signatures is a monumental task.

Artificial intelligence is adept at sorting through huge amounts of data to look for patterns, making this the perfect job to put it to work on. So for the new study, researchers at the University of Toronto developed a machine learning algorithm designed to seek out the most promising techno-signature candidates. Out of the 3 million signals in the dataset, the AI identified 20,515 signals of interest. The research team then inspected each of these manually, leaving just eight signals with the right characteristics to be techno-signatures that could not be written off as interference.
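The scale of the winnowing is worth making explicit - simple arithmetic on the figures quoted above:

```python
# How aggressively each stage of the Breakthrough Listen search filtered the data.
total_signals = 3_000_000   # raw radio signals in the dataset
ai_candidates = 20_515      # flagged as interesting by the machine learning algorithm
survivors = 8               # still standing after manual inspection

ai_kept = ai_candidates / total_signals   # fraction the AI passed through
print(f"{ai_kept:.2%}")                   # 0.68% of raw signals kept by the AI
print(total_signals // survivors)         # 375,000 raw signals per final candidate
```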
Read the report
SEARCH ENGINE OF THE MONTH
While lots of users, influencers and other commentators have been busy trying out the new chatbot-powered Bing and Google's rival offering, known as Bard, a start-up named You.com - founded in 2020 - has been offering the same kind of AI-powered search capabilities but remains mostly ignored and unknown. The company claims its YouChat AI bot could grow into a serious competitor that might threaten Microsoft and Google's AI dominance in the future. Meanwhile, recent search-engine news headlines have been mostly limited to Microsoft and Google. Microsoft's unveiling of its new ChatGPT-powered Bing has taken up most of the spotlight, while Google's attempt to garner some of the AI attention with its alternative, Bard, fared less well when that tool's widely reported wrong answers wiped $100 billion off the company's market value. In the midst of this battle of the giants for AI-assisted search dominance, only a few commentators noticed - or in some cases remembered - that what is claimed to be very much the same AI-enhanced search technology has been available at You.com since as early as December 2022.
Take a look
Zen Internet - Home Sales
01706 902573
Zen Internet - Customer Enquiries
01706 902001