Saturday, July 14, 2018

Quantum Computing For Computer Scientists


Quantum computing is coming, so a lot of people are trying to articulate why we want it and how it works. Most of the explanations are either hardcore physics discussions of spin and entanglement, or breezy hand-waving that can build a little intuition but isn't much use for actually applying the technology. Microsoft Research has a video that attempts to hit the spot in the middle — practical information for people who currently work with traditional computers. You can see the video below.

The video starts with basics you’d get from most videos talking about vector representation and operations. You have to get through about 17 minutes of that sort of thing until you get into qubits. If you glaze over on math, listen to the “index array” explanations [Andrew] gives after the math and you’ll be happier.

Billing the Deutsch Oracle as an example of why quantum computing is superior makes us nervous. The premise is that you can identify a black box in one operation, as opposed to two on a classical computer. The problem is that to do that, you need to modify the black box to take an extra bit. Well, if I can modify the black box to take an extra bit (in a different way) on a classical computer, I can identify the function in one operation, too. It is still a good explanation of a fundamental concept — the claim that it demonstrates a quantum advantage just doesn't bear scrutiny.

It might be better, in our opinion, to show how it mirrors parallelism in classical computing. For example, if I can modify my black box to do the same operation on two bits in parallel, I get the same result. The quantum modification — granted — is simpler, but it still requires you to add an extra bit and modify the black box.
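For the curious, Deutsch's algorithm is small enough to simulate directly with state vectors. The sketch below is plain NumPy (not the Microsoft tools shown in the video): it builds the oracle U_f as a 4×4 permutation matrix and decides constant-versus-balanced with a single application of the oracle.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

def oracle(f):
    """Build U_f: |x, y> -> |x, y XOR f(x)> as a 4x4 permutation matrix.

    Basis ordering: index = 2*x + y, with x the first (query) qubit.
    """
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[2 * x + (y ^ f(x)), 2 * x + y] = 1
    return U

def deutsch(f):
    """Classify a one-bit function with a single oracle query."""
    state = np.kron([1, 0], [0, 1])        # start in |0>|1>
    state = np.kron(H, H) @ state          # put both qubits in superposition
    state = oracle(f) @ state              # the one and only query
    state = np.kron(H, np.eye(2)) @ state  # interfere the first qubit
    # Probability the first qubit measures 1 (phase kickback occurred)
    p1 = state[2] ** 2 + state[3] ** 2
    return "balanced" if p1 > 0.5 else "constant"
```

Running it on any of the four possible one-bit functions classifies them correctly; a classical computer querying the same unmodified black box would need two evaluations of f, which is the (qualified) advantage discussed above.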

The video closes with some live demos using the Microsoft tools. If you watch this video and want to do some hands-on in your browser — or even a real machine — you might enjoy our tutorial series. If you are trying to just explain or understand quantum computing at a higher level, the IBM videos are a lot more breezy.
read more "Quantum Computing For Computer Scientists"

Thursday, July 12, 2018

What a New Design Could Mean for Apple Watch Series 4

The Apple Watch Series 4 expected this fall will feature a new design and a display that's 15 percent larger, according to the latest reports from respected Apple analyst Ming-Chi Kuo. Additional details have yet to emerge, but the Series 4 could mark the first design change introduced in the Apple Watch since its debut three years ago.

Kuo expects the two new Apple Watch models will measure 1.57 inches (39.9mm) and 1.78 inches (45.2mm), compared to the current 38mm and 42mm Apple Watch sizes. Rumors remain unclear on whether the casings themselves will be larger or if a reduction in bezel size will allow for larger OLED displays, but that hasn't stopped designers from sharing their own Series 4 concepts.

Rumored new design (left) alongside Series 3 (right)
Venya Geskin created the above mockups envisioning what a larger display could look like on a new Apple Watch while retaining the existing physical dimensions. In this design, the Series 4 screen adopts rounded corners to align with the reduced bezels and maximize the display area, similar to the display on the iPhone X.

Such a design change would significantly enhance user interaction with the Force Touch display, allowing for bigger clock faces and virtual buttons, and more space for viewing text, equating to less scrolling. In addition, watchOS 5 promises to bring rich HTML content to Apple Watch devices, so a larger screen would enhance the experience of navigating websites and interacting with them.

According to separate industry sources, there's an outside chance that the upcoming Apple Watch models could use micro-LED screens, which have the potential to be thinner and lighter, with improved color gamut, increased brightness, and support for higher resolutions. If micro-LED is adopted this year, it could potentially free up space for other hardware improvements.

Here’s exact to-scale drawings of Apple Watch Series 4 based on the rumors. Note the 38mm screen will be bigger than the current 42mm screen! 😱

— Ryan Jones (@rjonesy) July 11, 2018

Kuo believes the next Apple Watch will feature improved health monitoring capabilities, including improved heart rate features. The analyst doesn't explain whether these improvements would be based on software or hardware, but some combination of the two is one possibility.

The heart rate sensor in the current Apple Watch uses rapidly flashing green LED lights paired with light‑sensitive photodiodes to continuously monitor heart rate during workouts and Breathe sessions. However, the device switches to infrared light to measure heart rate in the background at intervals that vary, depending on your activity level.
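This LED-plus-photodiode approach is photoplethysmography: the pulse shows up as a periodic variation in reflected light, and heart rate falls out of counting peaks per minute. The toy sketch below (NumPy, with a synthetic signal) illustrates the counting step only; a real sensor pipeline would first filter out motion artifacts and ambient-light noise.

```python
import numpy as np

def estimate_bpm(signal, sample_rate_hz):
    """Estimate heart rate by counting peaks in a PPG-like signal.

    A "peak" here is a sample above the signal mean that exceeds its
    neighbors, a crude stand-in for real peak-detection filtering.
    """
    signal = np.asarray(signal, dtype=float)
    threshold = signal.mean()
    peaks = 0
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            peaks += 1
    duration_min = len(signal) / sample_rate_hz / 60.0
    return peaks / duration_min

# Synthetic 10-second recording at 50 Hz with an idealized 1.2 Hz pulse
t = np.arange(0, 10, 1 / 50)
ppg = np.sin(2 * np.pi * 1.2 * t)  # 1.2 beats/second = 72 BPM
```

With this idealized input, `estimate_bpm(ppg, 50)` recovers 72 BPM; the interesting engineering in a wrist-worn sensor is making that hold up on a noisy signal from a moving arm.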

Unlike some smartwatches, it's currently not possible to configure Apple Watch to monitor your heart rate continuously at all times. Such a feature in the Series 4 would be a welcome addition, but whether it could be battery-efficient without adopting a different sensor array than the current one is unclear.

That said, Kuo also believes that the Apple Watch will feature a higher-capacity battery for better battery life. The adoption of solid state buttons that don't physically click but instead use a Taptic Engine to provide haptic feedback to users could also potentially provide extra room for a larger battery.

Another possibility Apple is said to be testing is an EKG feature that records the electrical activity of the heart using two fingers on either side of the Apple Watch. This would detect electrical changes in the skin caused by the rhythm of the heartbeat – and potentially identify any irregularities.

Concept design by Benjamin Geskin
Whatever comes with Apple Watch Series 4, we should know for certain in a couple of months. Apple is expected to announce the device at its annual September event, which is typically held during the first or second week of September. What changes are you hoping for in the new Apple Watch? Let us know in the comments below.

Related Roundups: Apple Watch, watchOS 4, watchOS 5
Buyer's Guide: Apple Watch (Caution)

Discuss this article in our forums

read more "What a New Design Could Mean for Apple Watch Series 4"

Thursday, July 5, 2018

Hard Hat Tour: The Forum

Set to open in September, The Forum is the third building to open on Columbia University's Manhattanville Campus. Designed by Renzo Piano Building Workshop, the 56,000-sf building follows the Jerome L. Greene Science Center and the Lenfest Center for the Arts, both of which opened in 2017 and were also designed by RPBW. Last week, Columbia held a press tour of The Forum with architects from RPBW; below are my photos and a tour through the building.

The Forum
The Forum is located on the northeast corner of Broadway and 125th Street, on a triangular lot formed by the angle of 125th Street. At this location the concrete-and-glass Forum acts as a gateway to the Manhattanville Campus.

The Forum
The Forum sits south of the Greene and Lenfest buildings, on the left in the photo above, and just west of the 125th Street subway station, which runs as a viaduct due to the low topography along 125th Street.

The Forum
With the triangular site and position next to the subway, the large auditorium is located behind precast concrete walls on the east, while the meeting rooms, offices, and other spaces are found in the tapered glassy prow on the west.

The Forum
The narrow end, seen from inside the construction fence, is articulated as planes of concrete and glass with exposed structural steel outside the building.

The Forum
The rooms inside the prow are graced with a view of the Riverside Drive Viaduct.

The Forum
As in the other buildings on the Manhattanville Campus, both built and planned, the ground floor is open to the public, set back from the street, and glassy. This openness is made possible by a six- or seven-story service basement that will spread across the whole campus once it's completed, removing loading docks and other service functions from the ground floors of the buildings.

The Forum
The visual transparency of the ground floor is evident here, a future retail/cafe space facing the corner of 125th Street and Broadway.

The Forum
The tapered end of the ground floor, set to be the campus's information center, has more glass, including a canopy -- its framing is just visible to the left of the glass storefront.

The Forum
The main entrance is in the middle of the south-facing frontage on 125th Street. Once inside, visitors will see a security desk straight ahead. From here, the information center is to the left and retail is to the right, while security turnstiles provide restricted access to the elevators (one is just visible on the left) and the two upper floors.

The Forum
The west ends of the second and third floors are full of office spaces behind glass walls.

The Forum
Offices such as this one look out to the Greene Science Center across 129th Street.

The Forum
Circulation to the auditorium receives natural light through narrow windows set into the precast concrete panels.

The Forum
The windows at the chamfered southeast corner frame narrow views of the 125th Street viaduct.

The Forum
Although these photos make it seem like The Forum is all whites and grays, there are selective splashes of color throughout: flooring, casework, elevators, as well as the seats and surfaces in the auditorium. (Protective covers during construction left most of the color out of sight during our visit.) This large space is insulated from the sound of passing trains by masonry walls. Combined with the precast concrete on the exterior, the box-within-a-box construction is sufficient for acoustics, per RPBW.

The Forum
The zone between the concrete east end and glassy west end is turned into a terrace on the third floor; it is positioned two floors above the main entrance on the south frontage of 125th Street.

The Forum
I don't foresee this being a quiet terrace for Columbia employees or conference attendees, though, given the large air handling units positioned directly above the terrace.

The Forum
Following the tour was a reception in Lenfest's top-floor Lantern, which provides a view of The Forum's glassy prow just past the corner of the Greene Science Center.

The Forum
An outdoor terrace on one of Lenfest's lower floors looks on to the construction of the Business School, a pair of buildings designed by Diller Scofidio + Renfro for the block north of 130th Street. In between the buildings will be a large square designed by James Corner Field Operations, and below it will sit a 17-berth loading dock that will serve the entire campus via its below-grade service levels. A sense of the campus's scale of construction is evident in the concrete "bathtub" that forms the massive yet invisible basement that enables the glassy buildings above it.
read more "Hard Hat Tour: The Forum"

Oh Hell Yeah Honda Wants To Top 150 MPH In A Lawnmower


I have been fascinated by the concept of lawnmower racing since high school, when a guy in my grade, who called me “Vroom Vroom” because I liked NASCAR and often yelled “VROOM VROOM!” over everyone’s heads to get my attention in the hallway from many feet away, mentioned it once.

read more "Oh Hell Yeah Honda Wants To Top 150 MPH In A Lawnmower"

Sunday, June 17, 2018

Popular Mac Developer Slams Apple for 'Sad State of Macintosh Hardware'

Rogue Amoeba developer Quentin Carnicelli, who works on Mac software like Airfoil, Audio Hijack, Loopback, and Fission, this week penned a critique of Apple's Mac lineup and the company's recent lack of Mac updates, and that missive has been gaining some attention from Mac fans.

Using MacRumors' own Buyer's Guide, Carnicelli points out that it's been more than a year since any Mac, with the exception of the iMac Pro, has been updated.

It's been 375 days, for example, since the iMac, MacBook, MacBook Pro, and MacBook Air machines were last updated, and it's been 437 days since the Mac Pro saw the price drop Apple implemented as it works on a Mac Pro replacement.

The Mac Pro has not seen a hardware update since December of 2013, more than 1600 days ago. Apple has promised its professional users that a high-end high-throughput modular Mac Pro system is in the works, but we thus far have no details on when it might see a release.

The Mac mini, Apple's most affordable desktop Mac, has gone 1338 days without an update, with the last refresh introduced in October of 2014. While Apple has made promises about a refreshed Mac Pro, no similar statement has been provided about a future Mac mini, aside from a comment from Apple CEO Tim Cook stating that the Mac mini continues to be important to Apple.

According to Carnicelli, the state of the Mac lineup is "deeply worrisome" to him as a person who works for a Mac-based software company. Customers are, he says, forced to choose between "purchasing new computers that are actually years old" or "holding out in the faint hope that hardware updates are still to come."
It's very difficult to recommend much from the current crop of Macs to customers, and that's deeply worrisome to us, as a Mac-based software company. For our own internal needs, we've wound up purchasing used hardware for testing, rather than opting to compromise heavily on a new machine. That isn't good for Apple, nor is it what we want.
As Carnicelli points out, Apple could reassure its Mac users with updates and speed bumps to its Mac lineup on a "much more frequent basis," calling the current lack of updates "baffling and frightening to anyone who depends on the platform for their livelihood."

Apple in 2017 refreshed much of its Mac lineup (iMac, MacBook Pro, MacBook Air, and MacBook) at its Worldwide Developers Conference, but this year Apple opted to focus instead on software, announcing no new Mac hardware. Based on past release history, as iMore's Rene Ritchie points out, we could be looking at an 18-month upgrade cycle this time around, with new Macs making an appearance in September or October.

Some of the blame for Apple's lack of updates can perhaps be placed on its reliance on Intel; in the past, some Mac refreshes have been pushed back due to delays with Intel chips. This is likely one of the reasons why Apple is planning to transition from Intel chips to its own custom-made Mac chips as early as 2020.

MacBook, MacBook Pro, iMac, and MacBook Air upgrades are not in the dire state that Mac Pro and Mac mini upgrades are in, but increased attention on issues with the MacBook and MacBook Pro keyboards has left Apple customers eager to see those machines updated, especially as Apple has not acknowledged these keyboard issues despite their prevalence in the media.

"Apple needs to publicly show their commitment to the full Macintosh hardware line and they need to do it now," writes Carnicelli.

Carnicelli's comments on the state of the Mac lineup came just before Apple released a new Mac advertising campaign. Called "Behind the Mac," the campaign highlights creators who use their Macs to "make something wonderful."

The first ad spots in the series focus on photographer and disability advocate Bruce Hall, who uses his Mac for editing photographs, musician Grimes, who uses the Mac "from start to finish" to write all of her music, edit music videos, and more, and app developer Peter Kariuki who used his Mac to code the SafeMotos app, which is designed to connect passengers with safe motorcycle drivers in Rwanda.

These ads, while inspiring, may be seen as too little too late by those who have grown frustrated with Apple's Mac lineup and have come to see the lack of updates as an indicator of a lack of commitment to the Mac.
Discuss this article in our forums

read more "Popular Mac Developer Slams Apple for 'Sad State of Macintosh Hardware'"

Chinese Satellite Captures a Cool View of Earth from Lunar Orbit


After a 20-day journey, China’s Queqiao lunar communications relay satellite has made it to lunar orbit. Its companion, the Longjiang-2 microsatellite, wasted no time, taking some neat pics of Earth and the lunar surface.

read more "Chinese Satellite Captures a Cool View of Earth from Lunar Orbit"

After twenty years of Salesforce, what Marc Benioff got right and wrong about the cloud

As we enter the 20th year of Salesforce, there's an interesting opportunity to reflect on the change that Marc Benioff created with the software-as-a-service (SaaS) model for enterprise software through the launch of Salesforce.com.

This model has been validated by the annual revenue stream of SaaS companies, which is fast approaching $100 billion by most estimates, and it will likely continue to transform many slower-moving industries for years to come.

However, for the cornerstone market in IT — large enterprise-software deals — SaaS represents less than 25 percent of total revenue, according to most market estimates. This split is even evident in the most recent high-profile “SaaS” acquisition of GitHub by Microsoft, with over 50 percent of GitHub’s revenue coming from the sale of their on-prem offering, GitHub Enterprise.

Data privacy and security is also becoming a major issue, with Benioff himself even pushing for a U.S. privacy law on par with GDPR in the European Union. While consumer data is often the focus of such discussions, it’s worth remembering that SaaS providers store and process an incredible amount of personal data on behalf of their customers, and the content of that data goes well beyond email addresses for sales leads.

It’s time to reconsider the SaaS model in a modern context, integrating developments of the last nearly two decades so that enterprise software can reach its full potential. More specifically, we need to consider the impact of IaaS and “cloud-native computing” on enterprise software, and how they’re blurring the lines between SaaS and on-premises applications. As the world around enterprise software shifts and the tools for building it advance, do we really need such stark distinctions about what can run where?


The original cloud software thesis

In his book, Behind the Cloud, Benioff lays out four primary reasons for the introduction of the cloud-based SaaS model:

  1. Realigning vendor success with customer success by creating a subscription-based pricing model that grows with each customer’s usage (providing the opportunity to “land and expand”). Previously, software licenses often cost millions of dollars and were paid upfront, after which the customer was obligated to pay an additional 20 percent per year in support fees. This traditional pricing structure created significant financial barriers to adoption and made procurement painful and elongated.
  2. Putting software in the browser to kill the client-server enterprise software delivery experience. Benioff recognized that consumers were increasingly comfortable using websites to accomplish complex tasks. By utilizing the browser, Salesforce avoided the complex local client installation and allowed its software to be accessed anywhere, anytime and on any device.
  3. Sharing the cost of expensive compute resources across multiple customers by leveraging a multi-tenant architecture. This ensured that no individual customer needed to invest in expensive computing hardware required to run a given monolithic application. For context, in 1999 a gigabyte of RAM cost about $1,000 and a TB of disk storage was $30,000. Benioff cited a typical enterprise hardware purchase of $385,000 in order to run Siebel’s CRM product that might serve 200 end-users.
  4. Democratizing the availability of software by removing the installation, maintenance and upgrade challenges. Drawing from his background at Oracle, he cited experiences where it took 6-18 months to complete the installation process. Additionally, upgrades were notorious for their complexity and caused significant downtime for customers. Managing enterprise applications was a very manual process, generally with each IT org becoming the ops team executing a physical run-book for each application they purchased.

These arguments also happen to be, more or less, the same ones made by infrastructure-as-a-service (IaaS) providers such as Amazon Web Services during their early days in the mid-to-late ‘00s. However, IaaS adds value at a layer deeper than SaaS, providing the raw building blocks rather than the end product. The result of their success in renting cloud computing, storage and network capacity has been many more SaaS applications than ever would have been possible if everybody had to follow the model Salesforce did several years earlier.

Suddenly able to access computing resources by the hour—and free from large upfront capital investments or having to manage complex customer installations—startups forsook software for SaaS in the name of economics, simplicity and much faster user growth.

Source: Getty Images

It’s a different IT world in 2018

Fast-forward to today, and in some ways it’s clear just how prescient Benioff was in pushing the world toward SaaS. Of the four reasons laid out above, Benioff nailed the first two:

  • Subscription is the right pricing model: The subscription pricing model for software has proven to be the most effective way to create customer and vendor success. Stalwart products like Microsoft Office and the Adobe Suite successfully made the switch from the upfront model to thriving subscription businesses years ago. Today, subscription pricing is the norm for many flavors of software and services.
  • Better user experience matters: Software accessed through the browser or thin, native mobile apps (leveraging the same APIs and delivered seamlessly through app stores) has long since become ubiquitous. The consumerization of IT was a real trend, and it has driven the habits from our personal lives into our business lives.

In other areas, however, things today look very different than they did back in 1999. In particular, Benioff’s other two primary reasons for embracing SaaS no longer seem so compelling. Ironically, IaaS economies of scale (especially once Google and Microsoft began competing with AWS in earnest) and software-development practices developed inside those “web scale” companies played major roles in spurring these changes:

  • Computing is now cheap: The cost of compute and storage have been driven down so dramatically that there are limited cost savings in shared resources. Today, a gigabyte of RAM is about $5 and a terabyte of disk storage is about $30 if you buy them directly. Cloud providers give away resources to small users and charge only pennies per hour for standard-sized instances. By comparison, at the same time that Salesforce was founded, Google was running on its first data center—with combined total compute and RAM comparable to that of a single iPhone X. That is not a joke.
  • Installing software is now much easier: The process of installing and upgrading modern software has become automated with the emergence of continuous integration and deployment (CI/CD) and configuration-management tools. With the rapid adoption of containers and microservices, cloud-native infrastructure has become the de facto standard for local development and is becoming the standard for far more reliable, resilient and scalable cloud deployment. Enterprise software packaged as a set of Docker containers orchestrated by Kubernetes or Docker Swarm, for example, can be installed pretty much anywhere and be live in minutes.
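The price collapse in the first bullet is easy to quantify against the 1999 figures cited earlier (a $1,000 gigabyte of RAM, a $30,000 terabyte of disk, and Benioff's $385,000 Siebel server for 200 users). A quick back-of-the-envelope check, using only numbers from the text:

```python
# Approximate hardware prices cited in the text (USD)
ram_per_gb = {"1999": 1_000, "2018": 5}
disk_per_tb = {"1999": 30_000, "2018": 30}

ram_drop = ram_per_gb["1999"] / ram_per_gb["2018"]     # 200x cheaper
disk_drop = disk_per_tb["1999"] / disk_per_tb["2018"]  # 1,000x cheaper

# Benioff's example: $385,000 of hardware to serve 200 Siebel CRM users
per_seat_1999 = 385_000 / 200  # $1,925 of capital per end-user
```

At nearly $2,000 of capital per seat in 1999, multi-tenancy was the only way to make a subscription price work; at today's prices, that sharing argument loses most of its force.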

Source: Getty Images/ERHUI1979

What Benioff didn’t foresee

Several other factors have also emerged in the last few years that raise the question of whether the traditional definition of SaaS can really be the only one going forward. Here, too, there’s irony in the fact that many of the forces pushing software back toward self-hosting and management can be traced directly to the success of SaaS itself, and cloud computing in general:

  1. Cloud computing can now be “private”: Virtual private clouds (VPCs) in the IaaS world allow enterprises to maintain root control of the OS, while outsourcing the physical management of machines to providers like Google, DigitalOcean, Microsoft, Packet or AWS. This allows enterprises (like Capital One) to relinquish hardware management and the headache it often entails, but retain control over networks, software and data. It is also far easier for enterprises to get the necessary assurance for the security posture of Amazon, Microsoft and Google than it is to get the same level of assurance for each of the tens of thousands of possible SaaS vendors in the world.
  2. Regulations can penalize centralized services: One of the underappreciated consequences of Edward Snowden’s leaks, as well as an awakening to the sometimes questionable data-privacy practices of companies like Facebook, is an uptick in governments and enterprises trying to protect themselves and their citizens from prying eyes. Using applications hosted in another country or managed by a third party exposes enterprises to a litany of legal issues. The European Union’s GDPR law, for example, exposes SaaS companies to more potential liability with each piece of EU-citizen data they store, and puts enterprises on the hook for how their SaaS providers manage data.
  3. Data breach exposure is higher than ever: A corollary to the point above is the increased exposure to cybercrime that companies face as they build out their SaaS footprints. All it takes is one employee at a SaaS provider clicking on the wrong link or installing the wrong Chrome extension to expose that provider’s customers’ data to criminals. If the average large enterprise uses 1,000+ SaaS applications and each of those vendors averages 250 employees, that’s an additional 250,000 possible points of entry for an attacker.
  4. Applications are much more portable: The SaaS revolution has resulted in software vendors developing their applications to be cloud-first, but they’re now building those applications using technologies (such as containers) that can help replicate the deployment of those applications onto any infrastructure. This shift to what’s called cloud-native computing means that the same complex applications you can sign up to use in a multi-tenant cloud environment can also be deployed into a private data center or VPC far more easily than previously possible. Companies like BigID, StackRox, Dashbase and others are taking a private cloud-native instance-first approach to their application offerings. Meanwhile, SaaS stalwarts like Atlassian, Box, GitHub and many others are transitioning to Kubernetes-driven, cloud-native architectures that provide this optionality in the future.
  5. The script got flipped on CIOs: Individuals and small teams within large companies now drive software adoption by selecting the tools (e.g., GitHub, Slack, HipChat, Dropbox), often SaaS, that best meet their needs. Once they learn what’s being used and how it’s working, CIOs are faced with the decision to either restrict network access to shadow IT or pursue an enterprise license—or the nearest thing to one—for those services. This trend has been so impactful that it spawned an entirely new category called cloud access security brokers—another vendor that needs to be paid, an additional layer of complexity, and another avenue for potential problems. Managing local versions of these applications brings control back to the CIO and CISO.

Source: Getty Images/MIKIEKWOODS

The future of software is location agnostic

As the pace of technological disruption picks up, the previous generation of SaaS companies is facing a future similar to the legacy software providers they once displaced. From mainframes up through cloud-native (and even serverless) computing, the goal for CIOs has always been to strike the right balance between cost, capabilities, control and flexibility. Cloud-native computing, which encompasses a wide variety of IT facets and often emphasizes open source software, is poised to deliver on these benefits in a manner that can adapt to new trends as they emerge.

The problem for many of today’s largest SaaS vendors is that they were founded and scaled out during the pre-cloud-native era, meaning they’re burdened by some serious technical and cultural debt. If they fail to make the necessary transition, they’ll be disrupted by a new generation of SaaS companies (and possibly traditional software vendors) that are agnostic toward where their applications are deployed and who applies the pre-built automation that simplifies management. This next generation of vendors will put more control in the hands of end customers (who crave control), while maintaining what vendors have come to love about cloud-native development and cloud-based resources.

So, yes, Marc Benioff and Salesforce were absolutely right to champion the “No Software” movement over the past two decades, because the model of enterprise software they targeted needed to be destroyed. In the process, however, Salesforce helped spur a cloud computing movement that would eventually rewrite the rules on enterprise IT and, now, SaaS itself.

read more "After twenty years of Salesforce, what Marc Benioff got right and wrong about the cloud"

Jaguar breaks the world's electric boat speed record

You frequently see car manufacturers trying to break electric speed records on land, but where are the boats? Don't worry -- Jaguar, Vector and Williams feel the need for nautical speed. The trio have broken both the world and UK speed records with...
read more "Jaguar breaks the world's electric boat speed record"

How Alonso Had An Enormous Impact On Toyota's First Le Mans Win

It'd be easy to think that Alonso piggybacked on the experience of his teammates and the lack of competition to secure Le Mans victory, but that couldn't be further from the truth

read more "How Alonso Had An Enormous Impact On Toyota's First Le Mans Win"

Alonso, along with Nakajima & Buemi, win the Le Mans 24 Hours

Fernando Alonso, together with teammates Sebastien Buemi & Kazuki Nakajima, has won the Le Mans 24 Hours.

Alonso signed up to take part in the World Endurance Championship this season, racing for the Toyota factory LMP1 team at events that don’t clash with his Formula 1 schedule.

Le Mans, the iconic 24 hour endurance race at the Circuit de la Sarthe in France, was one of those events. Following on from his retirement from the Canadian Grand Prix last weekend, Alonso hopped on a plane to Le Mans to join his teammates and prepare for the race.

The two factory Toyotas were the only clear contenders for outright victory in the LMP1 class, with Alonso, Kazuki Nakajima & Sebastien Buemi driving the #8 car, and Kamui Kobayashi, Mike Conway & Jose Maria Lopez driving the #7. The two TS050s spent a large proportion of the race locked in their own private battle, with the #7 car enjoying a healthy lead at the one-third mark.

However, the momentum started to shift in the early hours of Sunday morning as Alonso put in a very strong stint, catching Lopez at a tremendous rate, with Nakajima following suit when it was his turn at the wheel. Nakajima went on to pass Kobayashi in the #7 before slowly easing out a gap. Penalties for the #7 followed, with Kobayashi missing his scheduled driver swap and completing too many laps in one stint. After 388 laps of racing, Nakajima crossed the line to win for Toyota and his teammates.

This marks Toyota’s first win at Le Mans, coming two years after their astonishing defeat in 2016, when their leading car broke down in the final minutes with the chequered flag almost in sight.

With Alonso now part of a winning crew in Le Mans’ premier class, he adds to his motorsport accomplishments. The so-called ‘Triple Crown of Motorsport’ consists of Formula 1’s Monaco Grand Prix, the Le Mans 24 Hours & the Indianapolis 500. Having won at Monaco in F1 and now at Le Mans, Alonso needs only an Indy 500 win to complete the set; Graham Hill remains the only driver to have achieved it.

“It is an amazing feeling, I am still in shock,” he said. “I was stressed at the end – I am not used to watching my own car racing. It was a tense 24 hours with the two cars being within a minute the whole race. I am trying to enjoy every second of this moment.”

Alonso’s F1 future is uncertain, with no contract beyond this season with McLaren. It is believed that Alonso has grown disillusioned with the midfield and lack of competitiveness of his team in F1, and that he is pushing for a return to the Indy 500 after taking part in last year’s race. While he retired with a blown engine on that occasion, Alonso had been a strong contender for the win throughout.

Former F1 teammate Jenson Button was also taking part in the Le Mans race with privateer team SMP Racing, but retired with an engine failure following a lengthy stay in the pits.

An enormous victory for the whole team in the greatest race in the world. The 24 Hours of Le Mans. Thank you 💙

— Fernando Alonso (@alo_oficial) June 17, 2018

read more "Alonso, along with Nakajima & Buemi, win the Le Mans 24 Hours"