Lab Scans And Analyzes Sting’s Brain As Part Of Study Of Mental Music Organization

Following a recent show in Montreal, Sting met with Daniel Levitin, a cognitive psychologist at McGill University, to have fMRI images of his brain taken as part of an ongoing study of how the brain of a skilled musician analyzes and organizes music. In a paper outlining the study published on McGill's website, Levitin explains that he and his partners have developed imaging-analysis techniques that provide insight into how gifted individuals find connections between seemingly disparate thoughts or sounds, in fields ranging from the arts to politics to science. "These state-of-the-art techniques really allowed us to make maps of how Sting's brain organizes music. That's important because at the heart of great musicianship is the ability to manipulate in one's mind rich representations of the desired soundscape."

The research came about as a result of a mutual admiration between Sting and the McGill psychologist. Years ago, Sting read Levitin's book This Is Your Brain on Music and asked to meet Levitin and tour his facilities, as many musicians have done over the years. While there, Levitin asked if Sting would be interested in having his brain scanned, and the musician obliged.

Both functional and structural scans were conducted in a single session at the brain imaging unit of McGill's Montreal Neurological Institute on the hot afternoon of his July 5th concert with Peter Gabriel at the Bell Centre (part of their current Rock Paper Scissors Tour). A power outage knocked the entire campus offline for several hours, threatening to cancel the experiment. Because it takes over an hour to reboot the fMRI machine, time grew short, and Sting graciously agreed to skip his soundcheck in order to complete the scan.

Levitin then teamed up with Scott Grafton, a leading brain-scan expert at the University of California, Santa Barbara, to analyze the scans using two novel techniques. The techniques showed which songs Sting found similar to one another and which he found dissimilar, based not on tests or questionnaires but on activations of brain regions. Says Grafton, "At the heart of these methods is the ability to test if patterns of brain activity are more alike for two similar styles of music compared to different styles. This approach has never before been considered in brain imaging experiments of music."

According to Levitin, "Sting's brain scan pointed us to several connections between pieces of music that I know well but had never seen as related before." The most surprising neural connection was the similarity in brain activity between Piazzolla's "Libertango", a tango composition, and the Beatles' "Girl" off 1965's Rubber Soul. While the songs differ greatly in sound and genre, both pieces are in minor keys and include similar melodic motifs. Another example of similar neurological responses to seemingly different songs was Sting's own "Moon Over Bourbon Street" and Booker T. and the M.G.'s "Green Onions", both of which share a 132 bpm tempo and a swinging rhythm. While more information is needed to draw any scientific conclusions, these tests provide insight into the connecting factors between different kinds of music in terms of how they are received and processed by the mind of a musician.

[via McGill University]
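For readers curious what the comparison Grafton describes looks like in practice, here is a minimal sketch of a pattern-similarity test, assuming each piece of music is summarized as one activation value per voxel. The data below are randomly generated stand-ins, not the study's actual scans, and the function name is hypothetical:

```python
import numpy as np

def pattern_similarity(a, b):
    """Pearson correlation between two voxel activation vectors."""
    return np.corrcoef(a, b)[0, 1]

# Hypothetical activation vectors: one value per voxel, one vector per song.
rng = np.random.default_rng(0)
activations = {
    "Libertango": rng.normal(size=5000),
    "Girl": rng.normal(size=5000),
    "Green Onions": rng.normal(size=5000),
}

# Pairwise similarity: higher correlations suggest the brain represents
# the two pieces more alike, which is the core test Grafton describes.
songs = list(activations)
for i, s1 in enumerate(songs):
    for s2 in songs[i + 1:]:
        r = pattern_similarity(activations[s1], activations[s2])
        print(f"{s1} vs {s2}: r = {r:.3f}")
```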

Accelerating Storage Innovation in the Next Data Decade

Over the past decade, technology transformed nearly every business into an IT-driven business. From farming to pharmaceuticals, these developments have led organizations to reimagine how they operate, compete, and serve customers. Data is at the heart of these changes and will continue its transformative trajectory as organizations navigate the waves of technological progress in the next "Data Decade."

In data storage, which touches every IT-driven business, the pace of innovation is accelerating, yet most enterprises continue to struggle with data's explosive growth and velocity. Getting the highest use and value from their data is becoming ever more critical for organizations, especially those with data stores reaching exabyte scale.

To have strategic value in the enterprise, storage innovation must cross the capabilities chasm from merely storing and moving bits to holistic data management.

In 2019, our Dell Technologies Storage CTO Council studied more than 90 key technologies and ranked which ones have the innovation potential to help storage cross that chasm in the next 5-10 years. This year, there are three key areas we believe will be difference-makers for organizations pushing the limits of current storage and IT approaches. Let's take a closer look.

Trend #1: Machine learning and CPU performance unlock new storage and data management approaches

This year, we will see new approaches that solve streaming-data challenges, including container-based architectures and software-defined storage. Customers in industries such as manufacturing, cybersecurity, autonomous vehicles, public safety and healthcare want to build applications that treat data as streams instead of breaking it up into separate files or objects.

Ingesting and processing stream data poses unique challenges for traditional IT and storage systems. Because streaming workloads often change throughout the day, storage capacity and compute power must be elastic to accommodate them. This requires intelligence within the storage that can provide instant autoscaling.

When everything is treated as a data stream, event data can be replayed the same way we watch a live sporting event on a DVR-enabled TV: the program can be paused, rewound and replayed instantly. Until now, application developers have been limited in their ability to address use cases that leverage data as streams for capture, playback and archive. Enabling these capabilities will make it easier to build applications for use cases that were never possible before.
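To make the append-and-replay model concrete, here is a toy in-memory sketch. Real streaming storage adds partitioning, durability and the autoscaling described above; the class and method names here are invented for illustration, not any specific product API:

```python
from dataclasses import dataclass, field

@dataclass
class EventStream:
    """Toy append-only stream with DVR-style replay (hypothetical API)."""
    events: list = field(default_factory=list)

    def append(self, event):
        # Events are only ever appended; list position serves as the offset.
        self.events.append(event)
        return len(self.events) - 1  # offset of the new event

    def replay(self, from_offset=0, to_offset=None):
        # "Rewind" by replaying from any earlier offset, like a DVR.
        yield from self.events[from_offset:to_offset]

stream = EventStream()
for reading in ({"sensor": 1, "t": 0}, {"sensor": 1, "t": 1}):
    stream.append(reading)

# Replay the whole history, then just the tail.
print(list(stream.replay()))
print(list(stream.replay(from_offset=1)))
```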
Dataset Management helps solve the data lifecycle problem

In the realm of data management, 2020 will usher in new approaches for organizations wishing to better manage data that is distributed across many silos of on-prem and cloud data stores. Data growth has been outstripping the growth of IT budgets for years, making it difficult for organizations not only to keep and store all their data, but also to manage, monetize, secure and make it useful for end users.

Enter Dataset Management, an evolving discipline that uses various approaches and technologies to help organizations better use and manage data through its lifecycle. At its core, it is about the ability to store data transparently and make it easily discoverable. Our industry has been very good at storing block, file and object data, sometimes unifying these data in a data lake. Dataset Management is the evolution of the data lake, providing customers with the ability to instantly find the data they want and make it actionable in the proper context across on-prem and cloud-based data stores.

Dataset Management will be especially useful for industries (e.g. media & entertainment, healthcare, insurance) that frequently have data stored across different storage systems and platforms, from raw data generated by devices and instruments to derivative data at the project level. Customers want the ability to search across these data stores for tasks such as creating custom workflows. For instance, many of our largest media & entertainment customers are using Dataset Management to connect with asset management databases to tag datasets, which can then be moved to the correct data centers for special-effects work or digital post-processing, then to distribution and finally to archives.

Traditional methods for managing unstructured data only take you so far. Because of new technological advancements like machine learning and higher CPU performance, we see Dataset Management growing further in prominence in 2020, as it offers organizations a bridge from the old world of directories and files to the new world of data and metadata.
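Stripped to its essence, the tag-and-search workflow described above amounts to a metadata catalog spanning multiple stores. A minimal sketch, with invented dataset names, stores and tags (this is not any specific Dell product API):

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    store: str   # hypothetical store labels, e.g. "on-prem-nas"
    tags: set

catalog = [
    Dataset("shoot-042-raw", "on-prem-nas", {"raw", "project:film-x"}),
    Dataset("shoot-042-vfx", "cloud-object", {"derivative", "project:film-x"}),
    Dataset("claims-2019", "on-prem-san", {"insurance", "archive"}),
]

def find(catalog, *required_tags):
    """Search every store by metadata, not by directory path."""
    return [d for d in catalog if set(required_tags) <= d.tags]

# All assets for one project, wherever they physically live.
for d in find(catalog, "project:film-x"):
    print(d.name, "->", d.store)
```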
Trend #2: Storage will be architected and consumed as software-defined

We can expect new storage designs in 2020 that further blur the line between storage and compute. Some of our customers tell us they are looking for more flexibility in their traditional SANs, wishing to have compute as close to storage as possible to support data-centric workloads and to reduce operational complexity.

With deeper integration of virtualization technologies on the storage array, apps can run directly on the same system and be managed with standard tools. This could suit data-centric applications that perform very storage- and data-intensive operations (e.g. analytics and demanding database apps), as well as workloads that require low transactional latency across large volumes of data.

This isn't HCI in the classic sense; rather, it is about leveraging and interoperating with existing infrastructure and processes while giving a greater degree of deployment flexibility to suit the customer's specific environment and application. It could open up new use cases (e.g. AI/ML and analytics at edge locations or in a private cloud, workload domains), and it could lower cost of ownership and simplify operations for IT teams and application owners, who would not always have to rely on a storage admin to provision or manage the underlying storage.

Software-defined infrastructure no longer just for hyper-scalers

Software-defined infrastructure (SDI) is also becoming a greater consideration in enterprise data centers, augmenting traditional SANs and HCI deployments. Long the realm of hyper-scalers, SDI is now ready for adoption by traditional enterprises redeploying workloads whose capacity and compute requirements differ from what traditional three-layer SANs can provide. These are customers architecting for agility at scale who want the flexibility to scale storage and compute rapidly and independently of each other, such as a customer that needs to consolidate multiple high-performance (e.g. database) or general workloads. As enterprises pursue consolidation strategies, they will bump up against the limits of traditional SANs and the unpredictable performance, costs and lock-in of cloud services. This is where SDI becomes a very viable alternative to traditional SANs and HCI for certain workloads.

Trend #3: High-performance Object storage enters the mainstream

As Object moves from cheap-and-deep cold storage or archive to a modern cloud-native storage platform, performance is on many people's minds. One of the reasons we see this trending upward this year is demand from application developers. Analytics is also driving a lot of demand, and we expect to see companies in different verticals moving in this direction.

In turn, the added performance of flash and NVMe is creating tremendous opportunity for Object-based platforms to support workloads that require speed and near-limitless scale (e.g. analytics, Advanced Driver Assistance Systems (ADAS), IoT, cloud-native app development). A side note: historically, Object storage hasn't been fast enough for ADAS workloads, but all-flash is changing that conversation.

Flash-based Object storage with automated tiering to disk offers a cost-effective solution, particularly when a customer is talking about hundreds of petabytes or exabyte scale. It allows you to move the data you need up to the flash tier to run your analytics and high-performance applications, then move it off to a cold or archive tier when you're done with it.
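A tiering policy like the one just described is, at its simplest, a placement decision keyed on access recency. Here is a toy sketch; the capacity limit, time window and object records are invented for illustration, and real systems also weigh heat maps, object size and cost models:

```python
import time

FLASH_CAPACITY = 2       # toy limit: objects the flash tier can hold
HOT_WINDOW_SECS = 3600   # objects accessed within the last hour stay hot

def retier(objects, now=None):
    """Toy policy: promote recently accessed objects to flash,
    demote everything else to the disk/archive tier."""
    now = now or time.time()
    # Hottest first: most recently accessed objects win the flash slots.
    ranked = sorted(objects, key=lambda o: o["last_access"], reverse=True)
    for i, obj in enumerate(ranked):
        hot = i < FLASH_CAPACITY and now - obj["last_access"] < HOT_WINDOW_SECS
        obj["tier"] = "flash" if hot else "archive"
    return objects

now = time.time()
objs = [
    {"name": "adas-run-7", "last_access": now - 60, "tier": None},
    {"name": "logs-2018", "last_access": now - 90 * 86400, "tier": None},
    {"name": "frames-live", "last_access": now - 5, "tier": None},
]
for o in retier(objs, now):
    print(o["name"], "->", o["tier"])
```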
As Object becomes tuned for flash and NVMe, we expect a higher level of interest in Object for data that has traditionally been stored on file-based NAS, such as images, log data and machine-generated data.

As the pace of technology innovation accelerates, so too will the possibilities in storage and data management. We are standing with our customers at the dawn of the "Data Decade." If the last ten years brought some of the most dramatic changes in tech, just imagine what's next.

Read what other Dell Technologies experts are saying about key technology trends in 2020 and beyond:

"Technical Disruptions Emerging in 2020," by John Roese, CTO, Products & Operations, Dell Technologies
"Dell EMC's 2020 Server Trends & Observations," by Robert Hormuth, Vice President & Fellow, Chief Technology Officer, Server & Infrastructure Systems, Dell EMC
"Dell 2020 Networking & Solutions Technology Trends," by Ihab Tarazi, Chief Technology Officer and Senior Vice President, Networking and Solutions, Dell Technologies

Notre Dame to announce fall semester plans mid-June

With other universities making announcements regarding the 2020 fall semester, the Office of the President said in an email to faculty that while there are no definite plans at this time, an update will be provided by mid-June.

The University has created several groups that will work together to plan for the reopening of campus, according to the email, which was signed by University President Fr. John Jenkins, Provost Thomas Burish and provost-elect Marie Lynn Miranda. The Academic Year Continuity Working Group will consider alternative approaches for the start of the academic year. "Given the uncertainty about future conditions, the Working Group members are developing plans that maximize flexibility, considering factors such as the start date of the academic year, modes of delivery of instruction, and options for making changes during the course of the year as circumstances change," the email said.

According to the email, in "crafting a response to the disruption wrought by the current crisis, we will be guided by our central University goals, found here, and some of the principles they imply."

The Research Task Force, headed by Bob Bernhard, vice president for research, will create plans for reopening labs, libraries and studios.

To determine the steps necessary to bring the Notre Dame community back to campus, the Working Group will consider advice from experts in medicine, public health and epidemiology. This may include extensive diagnostics and immunity testing, contact tracing and quarantining students as necessary, the email said.

A Faculty Advisory Committee will also help evaluate plans and offer recommendations, which will be relayed to Jenkins.

As information regarding the virus and its transmission continues to change, the email said, predictions remain uncertain. "We can take encouragement from the devotion and incredible work being done by health care providers and scientific researchers worldwide — including here on Notre Dame's campus," the email said. "Nevertheless, at present, we cannot be sure when and if drugs will be developed to treat those with the virus, when tests for the virus and antibodies will be widely available, or when an effective vaccine will be found."

QPR back to prepare for Fulham clash

The QPR squad have arrived back in England following a five-day visit to Portugal.

Manager Mark Hughes was keen to take his players for a break in the sun ahead of the all-important run-in to the season.

Rangers, who are two places above the relegation zone, face Hughes' former club Fulham in a west London derby at Loftus Road on Saturday.

Recent signing Samba Diakite, who met his new team-mates during the trip, is expected to make his R's debut.

RISQ recruits former mylotto24 MD Blerina Essen as corporate advisor

RISQ, a specialist provider of industry insurance and risk management services, has confirmed the appointment of Blerina Essen, the former managing director of mylotto24 Ltd, as a corporate advisor.

Updating the market, RISQ leadership details that Essen has been appointed to help optimise the firm's strategy and planning with regard to its reinsurance and hedging verticals.

Regarded as an expert at the forefront of industry risk management strategy, Essen will work across RISQ divisions to help improve the hedging and reinsurance structures of the 'RISQ Insurance Platform', which enables partners to enhance campaigns or products with jackpot payouts backed by RISQ insurance provisions.

Tom Mitchell, Chief Commercial Officer at RISQ, said: "I am delighted to welcome Blerina to the team and to leverage her unrivalled experience and knowledge to improve our lottery insurance offering. It is important to us that our backend reinsurance structures are as robust as possible and, in the event of a large win, will pay out quickly and smoothly.

"As our business grows and volumes increase, our backend structures become increasingly important. Blerina will be a driving force behind these improvements to make our offering as flexible as possible and grow the use of jackpots in the global gaming industry."

Having led mylotto24's risk management services over a ten-year period, Essen spearheaded the development of international secondary lotteries for the UK, Ireland, South Africa and Australia, delivering a €250 million unit turnover for the business.

Blerina Essen, risk and strategy consultant at RISQ, said: "It is an honour to join RISQ in a consulting capacity at a time when the business is looking to achieve significant growth and to cement its position as a world-class alternative risk trading provider to the global gambling industry."