Everything Apple

Saturday, 30 June 2018

Benchmark’s Mitch Lasky will reportedly step down from Snap’s board of directors

Benchmark partner Mitch Lasky, who has served on Snap’s board of directors since December 2012, is not expected to stand for re-election and will thus be stepping down, according to a report by The Information.

Early investors stepping down from the board of directors — or at least not seeking re-election — isn’t that uncommon as once-private companies grow into larger public ones. Benchmark partner Peter Fenton did not seek re-election to Twitter’s board of directors in April last year. Snap, meanwhile, continues to navigate its future after declining precipitously since going public; it now sits at a valuation of around $16.5 billion. Partners with expertise in the earlier stages of the startup life cycle may end up seeing themselves as more useful taking a back seat and focusing on other investments. The voting process for board member re-election happens during the company’s annual meeting, so we’ll get more information when an additional proxy filing comes out ahead of the meeting later this year.

Benchmark is, or at least was at the time of going public last year, one of Snap’s biggest shareholders. According to the company’s 424B filing prior to going public in March last year, Benchmark held ownership of 23.1% of Snap’s Class B common stock and 8.2% of Snap’s Class A common stock. Lasky has been with Benchmark since April 2007, and also serves on the boards of a number of gaming companies like Riot Games and thatgamecompany, the creators of PlayStation titles flower and Journey. At the time, Snap said in its filing that Lasky was “qualified to serve as a member of our board of directors due to his extensive experience with social media and technology companies, as well as his experience as a venture capitalist investing in technology companies.”

The timing could be totally coincidental, but an earlier Recode report suggested Lasky had been talking about stepping back from future Benchmark funds. The firm only recently wrapped up a very public battle with Uber, which ended with Benchmark selling a significant stake in the company and a new CEO coming in to replace co-founder Travis Kalanick. Benchmark hired its first female general partner, Sarah Tavel, earlier this year.

We’ve reached out to both Snap and a representative from Benchmark for comment and will update the story when we hear back.

Friday, 29 June 2018

Apple is rebuilding Maps from the ground up

I’m not sure if you’re aware, but the launch of Apple Maps went poorly. After a rough first impression, an apology from the CEO, several years of patching holes with data partnerships and some glimmers of light with long-awaited transit directions and improvements in business, parking and place data, Apple Maps is still not where it needs to be to be considered a world-class service.

Maps needs fixing.

Apple, it turns out, is aware of this, so it’s rebuilding the maps part of Maps.

It’s doing this by using first-party data gathered by iPhones with a privacy-first methodology and its own fleet of cars packed with sensors and cameras. The new product will launch in San Francisco and the Bay Area with the next iOS 12 Beta and will cover Northern California by fall.

Every version of iOS will get the updated maps eventually and they will be more responsive to changes in roadways and construction, more visually rich depending on the specific context they’re viewed in and feature more detailed ground cover, foliage, pools, pedestrian pathways and more.

This is nothing less than a full reset of Maps, and it’s been four years in the making, dating back to when Apple began developing its new data-gathering systems. Eventually, Apple will no longer rely on third-party data to provide the basis for its maps, which has been one of its major pitfalls from the beginning.

“Since we introduced this six years ago — we won’t rehash all the issues we’ve had when we introduced it — we’ve done a huge investment in getting the map up to par,” says Apple SVP Eddy Cue, who now owns Maps, in an interview last week. “When we launched, a lot of it was all about directions and getting to a certain place. Finding the place and getting directions to that place. We’ve done a huge investment of making millions of changes, adding millions of locations, updating the map and changing the map more frequently. All of those things over the past six years.”

But, Cue says, Apple has room to improve on the quality of Maps, something that most users would agree on, even with recent advancements.

“We wanted to take this to the next level,” says Cue. “We have been working on trying to create what we hope is going to be the best map app in the world, taking it to the next step. That is building all of our own map data from the ground up.”

In addition to Cue, I spoke to Apple VP Patrice Gautier and over a dozen Apple Maps team members at its mapping headquarters in California this week about its efforts to re-build Maps, and to do it in a way that aligned with Apple’s very public stance on user privacy.

If, like me, you’re wondering whether Apple thought of building its own maps from scratch before it launched Maps, the answer is yes. At the time, there was a choice to be made about whether or not it wanted to be in the business of Maps at all. Given that the future of mobile devices was becoming very clear, it knew that mapping would be at the core of nearly every aspect of its devices from photos to directions to location services provided to apps. Decision made, Apple plowed ahead, building a product that relied on a patchwork of data from partners like TomTom, OpenStreetMap and other geo data brokers. The result was underwhelming.

Almost immediately after Apple launched Maps, it realized that it was going to need help and it signed on a bunch of additional data providers to fill the gaps in location, base map, point-of-interest and business data.

It wasn’t enough.

“We decided to do this just over four years ago. We said, ‘Where do we want to take Maps? What are the things that we want to do in Maps?’ We realized that, given what we wanted to do and where we wanted to take it, we needed to do this ourselves,” says Cue.

Because Maps is so core to so many functions, success wasn’t tied to just one of them. Maps needed to be great at transit, driving and walking — but also as a utility used by apps for location services and other functions.

Cue says that Apple needed to own all of the data that goes into making a map, and to control it from a quality as well as a privacy perspective.

There’s also the matter of corrections, updates and changes entering a long loop of submission, validation and update when you’re dealing with external partners. The Maps team would have to be able to correct roads, pathways and other map features in days or less, not months. Not to mention the potential competitive advantages it could gain from building and updating traffic data from hundreds of millions of iPhones, rather than relying on partner data.

Cue points to the proliferation of devices running iOS, now numbering in the millions, as a deciding factor to shift its process.

“We felt like because the shift to devices had happened — building a map today in the way that we were traditionally doing it, the way that it was being done — we could improve things significantly, and improve them in different ways,” he says. “One is more accuracy. Two is being able to update the map faster based on the data and the things that we’re seeing, as opposed to driving again or getting the information where the customer’s proactively telling us. What if we could actually see it before all of those things?”

I query him on the rapidity of Maps updates, and whether this new map philosophy means faster changes for users.

“The truth is that Maps needs to be [updated more], and even are today,” says Cue. “We’ll be doing this even more with our new maps, [with] the ability to change the map real-time and often. We do that every day today. This is expanding us to allow us to do it across everything in the map. Today, there’s certain things that take longer to change.

“For example, a road network is something that takes a much longer time to change currently. In the new map infrastructure, we can change that relatively quickly. If a new road opens up, immediately we can see that and make that change very, very quickly around it. It’s much, much more rapid to do changes in the new map environment.”

So a new effort was created to begin generating Apple’s own base maps, the very lowest building block of any really good mapping system. After that, Apple would begin layering on living location data, high-resolution satellite imagery and brand new, intensely high-resolution image data gathered from its ground cars until it had what it felt was a ‘best in class’ mapping product.

There is really only one big company on earth that owns an entire map stack from the ground up: Google.

Apple knew it needed to be the other one. Enter the vans.

Apple vans spotted

Though the overall project started earlier, the first glimpse most folks had of Apple’s renewed efforts to build the best Maps product was the vans that started appearing on the roads in 2015 with ‘Apple Maps’ signs on the side. Capped with sensors and cameras, these vans popped up in various cities and sparked rampant discussion and speculation.

The new Apple Maps will be the first time the data collected by these vans is actually used to construct and inform its maps. This is their coming out party.

Some people have commented that Apple’s rigs look more robust than the simple GPS + Camera arrangements on other mapping vehicles — going so far as to say they look more along the lines of something that could be used in autonomous vehicle training.

Apple isn’t commenting on autonomous vehicles, but there’s a reason the arrays look more advanced: they are.

Earlier this week I took a ride in one of the vans as it ran a sample route to gather the kind of data that would go into building the new maps. Here’s what’s inside.

In addition to a beefed-up GPS rig on the roof, four LiDAR arrays mounted at the corners and eight cameras shooting overlapping high-resolution images, there’s also the standard physical measuring tool attached to a rear wheel that allows for precise tracking of distance and image capture. In the rear there is a surprising lack of bulky equipment. Instead, it’s a straightforward Mac Pro bolted to the floor, attached to an array of solid-state drives for storage. A single USB cable routes up to the dashboard, where the actual mapping capture software runs on an iPad.

While mapping, a driver…drives, while an operator takes care of the route, ensuring that a coverage area that has been assigned is fully driven and monitoring image capture. Each drive captures thousands of images as well as a full point cloud (a 3D map of space defined by dots that represent surfaces) and GPS data. I later got to view the raw data presented in 3D and it absolutely looks like the quality of data you would need to begin training autonomous vehicles.

More on why Apple needs this level of data detail later.

When the images and data are captured, they are encrypted on the fly and recorded onto the SSDs. Once full, the SSDs are pulled out, replaced and packed into a case, which is delivered to Apple’s data center, where a suite of software eliminates private information like faces, license plates and other identifying details from the images. From the moment of capture to the moment they’re sanitized, the images are encrypted with one key in the van and the other key in the data center. Technicians and software further down the mapping pipeline never see unsanitized data.
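
The article doesn’t spell out Apple’s actual cryptography, but the ‘one key in the van, the other key in the data center’ description matches the familiar public-key pattern, where the capture rig holds only an encryption key and the data center alone holds the key that can decrypt. Below is a minimal, purely illustrative sketch of that pattern using PyNaCl sealed boxes; the structure and names are my assumptions, not Apple’s implementation.

```python
# Illustrative sketch of a van-encrypts, data-center-decrypts pattern.
# This is an assumption about the general approach, not Apple's actual scheme.
from nacl.public import PrivateKey, PublicKey, SealedBox

# Generated once at the data center; only the public half ships with the van.
datacenter_private = PrivateKey.generate()
van_public = datacenter_private.public_key

def encrypt_capture(raw_bytes: bytes, public_key: PublicKey) -> bytes:
    """Runs in the van: seal a captured frame before it is written to the SSD."""
    return SealedBox(public_key).encrypt(raw_bytes)

def decrypt_capture(sealed: bytes, private_key: PrivateKey) -> bytes:
    """Runs in the data center, just before the face/plate-removal pass."""
    return SealedBox(private_key).decrypt(sealed)

frame = b"raw camera frame bytes"
stored_on_ssd = encrypt_capture(frame, van_public)
assert decrypt_capture(stored_on_ssd, datacenter_private) == frame
```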

This is just one element of Apple’s focus on the privacy of the data it is utilizing in New Maps.

Probe data and privacy

In every conversation I have with members of the team throughout the day, privacy is brought up and emphasized. This is obviously by design, as Apple wants to impress upon me as a journalist that it’s taking this very seriously, but it doesn’t change the fact that privacy is evidently built in from the ground up, and I could not find a false note in any of the technical claims or the conversations I had.

Indeed, from the data security folks to the people whose job it is to actually make the maps work well, the constant refrain is that Apple does not feel it is being held back in any way by declining to hoover up every piece of customer data it can, then storing and parsing it.

The consistent message is that the team feels it can deliver a high quality navigation, location and mapping product without the directly personal data used by other platforms.

“We specifically don’t collect data, even from point A to point B,” notes Cue. “We collect data — when we do it — in an anonymous fashion, in subsections of the whole, so we couldn’t even say that there is a person that went from point A to point B. We’re collecting the segments of it. As you can imagine, that’s always been a key part of doing this. Honestly, we don’t think it buys us anything [to collect more]. We’re not losing any features or capabilities by doing this.”

The segments that he is referring to are sliced out of any given person’s navigation session. Neither the beginning nor the end of any trip is ever transmitted to Apple. Rotating identifiers, not personal information, are assigned to any data or requests sent to Apple, and it augments the ‘ground truth’ data provided by its own mapping vehicles with this ‘probe data’ sent back from iPhones.

Because only random segments of any person’s drive are ever sent, and that data is completely anonymized, there is no way to tie any trip back to a single individual. The local system signs the IDs and only it knows who an ID refers to. Apple is working very hard here to not know anything about its users. This kind of privacy can’t be added on at the end; it has to be woven in at the ground level.
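
To make that concrete, here is a toy sketch of the behavior described above: drop the start and end of a trip, slice what remains into short segments, and tag each segment with a fresh random identifier rather than anything tied to the user. The parameter values are invented for illustration; this is not Apple’s code.

```python
# Toy illustration of the probe-data segmentation described above (not Apple's code).
import secrets

def anonymize_trip(points, trim=5, segment_len=20):
    """Split one navigation session into anonymous probe segments.

    points: ordered list of (lat, lon, speed, heading) samples.
    The first and last `trim` samples (trip start/end) never leave the device,
    and each remaining segment gets its own random, rotating identifier so the
    segments cannot be stitched back together into a single trip.
    """
    interior = points[trim:-trim]             # endpoints stay on the device
    segments = []
    for i in range(0, len(interior), segment_len):
        chunk = interior[i:i + segment_len]
        if len(chunk) < segment_len:
            break                              # drop partial tails entirely
        segments.append({
            "id": secrets.token_hex(8),        # fresh identifier per segment
            "samples": chunk,                  # direction/speed slices only
        })
    return segments
```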

Because Apple’s business model does not rely on it serving, say, an ad for a Chevron on your route to you, it doesn’t need to even tie advertising identifiers to users.

Any personalization or Siri requests are all handled on-board by the iOS device’s processor. So if you get a drive notification that tells you it’s time to leave for your commute, that’s learned, remembered and delivered locally, not from Apple’s servers.

That’s not new, but it’s important to note given the new thing to take away here: Apple is flipping on the power of having millions of iPhones passively and actively improving their mapping data in real time.

In short: traffic, real-time road conditions, road systems, new construction and changes in pedestrian walkways are about to get a lot better in Apple Maps.

The secret sauce here is what Apple calls probe data: essentially little slices of vector data that represent direction and speed, transmitted back to Apple completely anonymized, with no way to tie them to a specific user or even to any given trip. Instead, Apple is reaching in and sipping tiny amounts of data from millions of users, giving it a holistic, real-time picture without compromising user privacy.

If you’re driving, walking or cycling, your iPhone can already tell. Now, if it knows you’re driving, it can also send relevant traffic and routing data in these anonymous slivers to improve the entire service. This only happens if your Maps app has been active; say, you check the map or look for directions. If you’re actively using your GPS for walking or driving, then the updates are more precise and can help with walking improvements like charting new pedestrian paths through parks, building out the map’s overall quality.

All of this, of course, is governed by whether you’ve opted in to Location Services, and it can be toggled off using the Maps location toggle in the Privacy section of Settings.

Apple says that this will have a near-zero effect on battery life or data usage, because probe data is only shared while you’re already using Maps features, and it’s a fraction of the power being drawn by those activities.

From the point cloud on up

But maps cannot live on ground truth and mobile data alone. Apple is also gathering new high-resolution satellite data to combine with its ground truth data for a solid base map. It’s then layering satellite imagery on top of that to better determine foliage, pathways, sports facilities and building shapes.

After the downstream data has been cleaned of license plates and faces, it gets run through a bunch of computer vision programming to pull out addresses, street signs and other points of interest. These are cross-referenced against publicly available data, like addresses held by the city, and new construction of neighborhoods or roadways that comes from city planning departments.

But one of the special-sauce bits that Apple is adding to the mix of mapping tools is a full-on point cloud that maps the world around the mapping van in 3D. This gives Apple all kinds of opportunities to better understand which items are street signs (retro-reflective rectangular object about 15 feet off the ground? Probably a street sign) or stop signs or speed limit signs.
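
That street-sign example is essentially a rules-plus-geometry classification over point-cloud clusters. Here is a toy version of such a heuristic; every threshold below is invented for illustration and nothing here is something Apple has disclosed.

```python
# Toy street-sign heuristic over point-cloud clusters; all thresholds invented.
from dataclasses import dataclass

@dataclass
class Cluster:
    height_m: float        # centroid height above the road surface
    width_m: float
    length_m: float
    reflectivity: float    # mean LiDAR return intensity, 0..1
    is_planar: bool        # roughly flat, like a sign face

def looks_like_street_sign(c: Cluster) -> bool:
    """Retro-reflective, roughly rectangular, a few meters off the ground."""
    return (
        c.is_planar
        and c.reflectivity > 0.8           # retro-reflective material
        and 2.0 < c.height_m < 6.0         # roughly 15 feet and below
        and 0.2 < c.width_m < 1.5
        and 0.2 < c.length_m < 1.5
    )
```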

It seems like it could also enable positioning of navigation arrows in 3D space for AR navigation, but Apple declined to comment on ‘any future plans’ for such things.

Apple also uses semantic segmentation and Deep Lambertian Networks to analyze the point cloud coupled with the image data captured by the car and from high-resolution satellites in sync. This allows 3D identification of objects, signs, lanes of traffic and buildings and separation into categories that can be highlighted for easy discovery.

The coupling of high-resolution image data from car and satellite, plus a 3D point cloud, results in Apple now being able to produce full orthogonal reconstructions of city streets with textures in place. This is massively higher resolution and easier to see, visually. And it’s synchronized with the ‘panoramic’ images from the car, the satellite view and the raw data. These techniques are used in self-driving applications because they provide a really holistic view of what’s going on around the car. But the ortho view can do even more for human viewers of the data by allowing them to ‘see’ through brush or tree cover that would normally obscure roads, buildings and addresses.

This is hugely important when it comes to the next step in Apple’s battle for supremely accurate and useful Maps: human editors.

Apple has had a team of tool builders working specifically on a toolkit that can be used by human editors to vet and parse data, street by street. The editor’s suite includes tools that allow human editors to assign specific geometries to Flyover buildings (think Salesforce Tower’s unique ridged dome) so that they are instantly recognizable. It lets editors look at real images of street signs shot by the car right next to 3D reconstructions of the scene and the computer vision detection of the same signs, so they can instantly confirm whether a detection is accurate.

Another tool corrects addresses, letting an editor quickly determine whether an address is misplaced, shift it around and move it to the center of its building. It also allows access points to be set, making Apple Maps smarter about the ‘last 50 feet’ of your journey. You’ve made it to the building, but what street is the entrance actually on? And how do you get into the driveway? With a couple of clicks, an editor can make that permanently visible.

“When we take you to a business and that business exists, we think the precision of where we’re taking you to, from being in the right building,” says Cue. “When you look at places like San Francisco or big cities from that standpoint, you have addresses where the address name is a certain street, but really, the entrance in the building is on another street. They’ve done that because they want the better street name. Those are the kinds of things that our new Maps really is going to shine on. We’re going to make sure that we’re taking you to exactly the right place, not a place that might be really close by.”

Water, swimming pools (new to Maps entirely), sporting areas and vegetation are now more prominent and fleshed out thanks to new computer vision and satellite imagery applications. So Apple had to build editing tools for those as well.

Many hundreds of editors will be using these tools, in addition to the thousands of employees Apple already has working on maps, but the tools had to be built first, now that Apple is no longer relying on third parties to vet and correct issues.

And the team also had to build computer vision and machine learning tools that allow it to determine whether there are issues to be found at all.

Anonymous probe data from iPhones, visualized, looks like thousands of dots, ebbing and flowing across a web of streets and walkways, like a luminescent web of color. At first, chaos. Then, patterns emerge. A street opens for business, and nearby vessels pump orange blood into the new artery. A flag is triggered and an editor looks to see if a new road needs a name assigned.

A new intersection is added to the web and an editor is flagged to make sure that the left turn lanes connect correctly across the overlapping layers of directional traffic. This has the added benefit of massively improved lane guidance in the new Apple Maps.

Apple is counting on this combination of human and AI flagging to allow editors to first craft base maps and then maintain them as the ever-changing biomass wreaks havoc on roadways, addresses and the occasional park.

Here there be Helvetica

Apple’s new Maps, like many other digital maps, display vastly differently depending on scale. If you’re zoomed out, you get less detail. If you zoom in, you get more. But Apple has a team of cartographers on staff who work on more cultural, regional and artistic levels to ensure that its Maps are readable, recognizable and useful.

These teams have goals that are at once concrete and a bit out there — in the best traditions of Apple pursuits that intersect the technical with the artistic.

The maps need to be usable, but they also need to fulfill cognitive goals on cultural levels that go beyond what any given user might know they need. For instance, in the US, it is very common to have maps that have a relatively low level of detail even at a medium zoom. In Japan, however, the maps are absolutely packed with details at the same zoom, because that increased information density is what is expected by users.

This is the department of details. They’ve reconstructed replicas of hundreds of actual road signs to make sure that the shield on your navigation screen matches the one you’re seeing on the highway road sign. When it comes to public transport, Apple licensed all of the typefaces that you see on your favorite subway systems, like Helvetica for NYC. And the line numbers are in the exact same order that you’re going to see them on the platform signs.

It’s all about reducing the cognitive load that it takes to translate the physical world you have to navigate through into the digital world represented by Maps.

Bottom line

The new version of Apple Maps will be in preview next week with just the Bay Area of California going live. It will be stitched seamlessly into the ‘current’ version of Maps, but the difference in quality level should be immediately visible based on what I’ve seen so far.

Better road networks; more pedestrian information; sports areas like baseball diamonds and basketball courts; more land cover, including grass and trees, represented on the map; and buildings, building shapes and sizes that are more accurate. A map that feels more like the real world you’re actually traveling through.

Search is also being revamped to make sure that you get more relevant results (on the correct continents) than ever before. Navigation, especially pedestrian guidance, also gets a big boost. Parking areas and building details to get you the last few feet to your destination are included as well.

What you won’t see, for now, is a full visual redesign.

“You’re not going to see huge design changes on the maps,” says Cue. “We don’t want to combine those two things at the same time because it would cause a lot of confusion.”

Apple Maps is getting the long awaited attention it really deserves. By taking ownership of the project fully, Apple is committing itself to actually creating the map that users expected of it from the beginning. It’s been a lingering shadow on iPhones, especially, where alternatives like Google Maps have offered more robust feature sets that are so easy to compare against the native app but impossible to access at the deep system level.

The argument has been made ad nauseam, but it’s worth saying again that if Apple thinks that mapping is important enough to own, it should own it. And that’s what it’s trying to do now.

“We don’t think there’s anybody doing this level of work that we’re doing,” adds Cue. “We haven’t announced this. We haven’t told anybody about this. It’s one of those things that we’ve been able to keep pretty much a secret. Nobody really knows about it. We’re excited to get it out there. Over the next year, we’ll be rolling it out, section by section in the US.”

AT&T’s low-cost TV streaming service Watch TV goes live

AT&T’s newly announced Watch TV, a low-cost live TV streaming service unveiled in the wake of the AT&T/Time Warner merger, is now up and running. The company already has one over-the-top streaming service in DirecTV Now, but this one is cheaper, has some restrictions, and doesn’t include local channels or sports in order to keep costs down.

At $15 per month, the service undercuts the existing low-cost leader Philo by a dollar, but offers a different lineup (Fomopop has a nice channel-by-channel comparison between the two, if you’re in the market.)

Both have 25 of the same channels in their packages, including A&E, AMC, Comedy Central, Food Network, Discovery, HGTV, History and others, but AT&T Watch is missing MTV, Nickelodeon and Travel Channel.

In total, Watch TV has over 30 live TV channels, plus 15,000+ TV shows and movies on demand, and it can be added by way of updated AT&T Wireless plans. Non-AT&T customers can subscribe for $15 per month directly.

AT&T has been monkeying around with its wireless plans to best take advantage of its Time Warner acquisition. With the new unlimited plans, it removed the previously free HBO perk and raised the entry-level plan by $5 per month, Ars Technica reported, detailing the changes that coincided with the launch of Watch TV. (Existing customers were grandfathered into free HBO.)

Instead, wireless customers on the top-tier AT&T Unlimited & More Premium plan can choose to add on another option – like HBO – for free. Other services they can opt for instead include Showtime, Starz, Amazon Music Unlimited, Pandora Premium and VRV.

The company also quietly raised its “administrative fee” for postpaid wireless customers from $0.76 to $1.99 per month, Ars noted as well, citing BTIG Research. This will bring in $800 million of incremental service revenue per year, the analyst firm said.
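
As a rough sanity check on that estimate (my own back-of-the-envelope arithmetic, not BTIG’s model), the $1.23-per-month increase and the $800 million annual figure together imply a base of roughly 54 million fee-paying postpaid subscribers:

```python
# Back-of-the-envelope check of the reported figures; not BTIG's actual model.
old_fee, new_fee = 0.76, 1.99                        # dollars per month
extra_per_sub_per_year = (new_fee - old_fee) * 12    # $14.76 per subscriber
implied_subscribers = 800_000_000 / extra_per_sub_per_year
print(f"~{implied_subscribers / 1e6:.0f}M fee-paying postpaid subscribers")  # ~54M
```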

Despite the price hikes and valid concerns over AT&T’s behavior, there’s likely going to be a market for this low-cost live TV service. The company’s DirecTV Now streaming service, launched in December 2016, reached 1.46 million subscribers in April. It’s catching up to the longtime leader, Dish’s Sling TV, which debuted at CES back in January 2015 and now has 2.3 million subscribers. Other newer arrivals, like Hulu with Live TV and YouTube TV, have subscribers in the hundreds of thousands.

AT&T’s Watch TV service will be available across platforms, including iOS, Android, Apple TV, Chromecast and Amazon Fire TV/Fire TV Stick, according to the service’s website. However, it only streams in high-def on the Premium wireless plan. It also doesn’t offer perks common to other live TV services, like a cloud DVR or support for multiple simultaneous streams.

The Watch TV apps are rolling out now. Early reviews note some similarity in layout to DirecTV Now. There are no reports yet of the crashes that are common to new launches like this.

Thursday, 28 June 2018

Bird has officially raised a whopping $300M as the scooter wars heat up

And there we have it: Bird, one of the massively hyped emerging scooter startups, has roped in its next pile of funding by picking up another $300 million in a round led by Sequoia Capital.

The company announced the long-anticipated round this morning, with Sequoia’s Roelof Botha joining the company’s board of directors. This is the second round of funding that Bird has raised in the span of just a few months, sending it from a reported $1 billion valuation in May to a $2 billion valuation by the end of June. In March, the company had a $300 million valuation, but the scooter hype train has officially hit a pretty impressive inflection point as investors pile in to back what many consider to be the next iteration of solving transportation at an even more granular level than cars or bikes. New investors in the round include Accel, B Capital, CRV, Sound Ventures, Greycroft and e.ventures, and previous investors Craft Ventures, Index Ventures, Valor, Goldcrest, Tusk Ventures and Upfront Ventures are also in the round. (So, basically everyone else who isn’t in competitor Lime.)

Scooter mania has captured the hearts of Silicon Valley and investors in general — including Paige Craig, who actually jumped from VC to join Bird as its VP of business — with a large amount of capital flowing into the area about as quickly as it possibly can. These sorts of revolving-door fundraising processes are not entirely uncommon, especially for very hot areas of investment, though the scooter scene has exploded considerably faster than most. Bird’s round comes amid reports of a mega-round for Lime, one of its competitors, with that company reportedly raising another $250 million led by GV, and Skip also raising $25 million.

“We have met with over 20 companies focused on the last mile problem over the years and feel this is a multi-billion dollar opportunity that can have a big impact in the world,” CRV’s Saar Gur, who did the deal for the firm, said. “We have a ton of conviction that this team has original product thought (they created the space) and the execution chops to build something extremely valuable here. And we have been long term focused, not short term focused, in making the investment. The ‘hype’ in our decision (the non-zero answer) is that Bird has built the best product in the market and while we kept meeting with more startups wanting to invest in the space – we kept coming back to Bird as the best company. So in that sense, the hype from consumers is real and was a part of the decision. On unit economics: We view the first product as an MVP (as the company is less than a year old) – and while the unit economics are encouraging, they played a part of the investment decision but we know it is not even the first inning in this market.”

There’s certainly an argument to be made for Bird, whose scooters you’ll see pretty much all over the place in cities like Los Angeles. For trips that are just a few miles down wide roads or sidewalks, where you aren’t likely to run into anyone, a quick scan of a code and a hop on a Bird may be worth the few bucks in order to save a few minutes crossing those considerably long blocks. Users can grab a Bird they see and start going right away if they’re running late, and it potentially alleviates the pressure of calling a car for short distances in traffic, where a scooter may actually make more sense physically for getting from point A to point B.

There are some considerable hurdles going forward, both theoretical and already in effect. In San Francisco, though it represents just a small slice of the United States’ metropolitan population, the company is facing significant pushback from the government, and scooters have been kicked off the sidewalks for the time being. There’s also the looming shadow of potential changes in tariffs, though Gur said that it likely wouldn’t be an issue and that “the unit economics appear to be viable even if tariffs were to be added to the cost of the scooters.” (Xiaomi is one of the suppliers for Bird, for example.)

Apple could bundle TV, music and news in a single subscription

According to a report from The Information, Apple could choose to bundle all its media offerings into a single subscription. While Apple’s main media subscription product is currently Apple Music, it’s no secret that the company is investing in other areas.

In particular, Apple has bought the distribution rights of many TV shows. But nobody knows how Apple plans to sell those TV shows. For instance, you could imagine paying a monthly fee to access Apple’s content in the TV app on your iPhone, iPad and Apple TV.

In addition to that, Apple acquired Texture back in March. Texture lets you download and read dozens of magazines with a single subscription. The company has partnered with Condé Nast, Hearst, Meredith, News Corp., Rogers Communications and Time Inc. to access their catalogs of magazines.

Texture is still available, but it’s clear that Apple has bigger plans. In addition to reformatting and redistributing web content in the Apple News app, the company could add paid content from magazines.

Instead of creating three different subscriptions (with potential discounts if you subscribe to multiple services), The Information believes that Apple is going to create a unified subscription. It’s going to work a bit like Amazon Prime, but without the package deliveries.

For a single monthly or annual fee, you’ll be able to access Apple Music, Apple TV’s premium content and Apple News’ premium content.

Even if users don’t consume everything in the subscription, they could see it as a good value, which could reduce attrition.

With good retention rates and such a wide appeal, it could help Apple’s bottom line now that iPhone unit sales are only growing by 0.5 percent year over year. It’s still unclear when Apple plans to launch its TV and news offerings.

Apple buries the hatchet with Samsung but could tap LG displays

After years of legal proceedings, Apple and Samsung have reached an agreement in the infamous patent case. Terms of the settlement were undisclosed. So is everything clear between Samsung and Apple? Not so fast, as Bloomberg reports that Apple wants to use OLED displays from LG to reduce its dependence on Samsung.

You might remember that Apple first sued Samsung for copying the design of the iPhone with early Samsung Galaxy phones. The first trial led to an Apple victory. Samsung had to pay $1 billion.

But the U.S. Patent and Trademark Office later invalidated one of Apple’s patents. It led to multiple retrials and appeals, and the Supreme Court even had to rule at some point.

After many years, Samsung ended up owing $539 million to Apple. According to Reuters, Samsung has already paid $399 million.

If you look closely at the original case, it feels like it happened many decades ago. At some point, the Samsung Galaxy S 4G, the Nexus S and a few other devices looked a lot like the iPhone 3G.

But now, it’s hard to say that Samsung is copying Apple. For instance, Samsung is one of the only phone manufacturers that hasn’t switched to a notch design. The Samsung Galaxy S9 and the rest of the product lineup still feature a rectangular display. Huawei, LG, OnePlus, Asus and countless others sell devices with a notch.

That could be why it seems odd to spend all this money on legal fees for things that are no longer true.

And yet, the irony is that Apple and Samsung are the perfect example of asymmetric competition. They both sell smartphones, laptops and other electronic devices. But they also work together on various projects.

In particular, the iPhone X is the first iPhone with an OLED display. It’s a better display technology than traditional LCD displays. It’s also one of the most expensive components of the iPhone X.

According to Bloomberg, Apple wants to find a second supplier to drive component prices down. And that second supplier is LG.

LG already manufactures OLED displays. But it’s difficult to meet Apple’s demands when it comes to the iPhone: Apple sells tens of millions of smartphones every year, so a supplier needs a great supply chain to keep up. LG could be ramping up its production capacity for future iPhone models.

According to multiple rumors, Apple plans to ship an updated iPhone X with an OLED display as well as a bigger iPhone. The company could also introduce another phone with an edge-to-edge LCD display with a notch and a cheaper price.

One thing is for sure: it’ll take time to switch the entire iPhone lineup to OLED displays.

Instead of points, Bumped gives equity in the companies you shop at

What does brand loyalty even mean anymore? App downloads, points, stars, and other complex reward systems have not just spawned their own media empires trying to decipher them, they have failed at their most basic objective: building a stronger bond between a brand and its consumers.

Bumped wants to reinvent the loyalty space by giving consumers shares of the companies they shop at. Through Bumped’s app, consumers choose their preferred retailer in different categories (think Lowe’s vs The Home Depot in home improvement), and when they spend money at that store using a linked credit card, Bumped will automatically give them ownership in that company.

The startup, which is based in Portland and was founded in March 2017, announced the beta launch of its service today, as well as a $14.1 million Series A led by Dan Ciporin at Canaan Partners, along with existing seed investors Peninsula Ventures, Commerce Ventures and Oregon Venture Partners.

Bumped is a brokerage, and the company told me that it has passed all FINRA and SEC licensing. When consumers spend money at participating retailers, they receive bona fide shares in the companies they shop at. Each retailer determines a loyalty percentage rate, which is a minimum of 1% and can go up to 5%. Bumped then buys shares off the public market to reward consumers, and in cases where it needs to buy fractional shares, it will handle all of those logistics.
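
Mechanically, the reward described here is just the purchase amount times the retailer’s loyalty rate, converted into a (possibly fractional) number of shares at the market price. The sketch below is illustrative only, since Bumped hasn’t published its internals.

```python
# Illustrative only -- Bumped hasn't published how it computes rewards.
def shares_earned(purchase_amount: float, loyalty_rate: float, share_price: float) -> float:
    """Convert a purchase into a (possibly fractional) share reward.

    loyalty_rate is set by the retailer, between 1% and 5% per the article.
    """
    if not 0.01 <= loyalty_rate <= 0.05:
        raise ValueError("loyalty rate outside the 1%-5% range described")
    reward_dollars = purchase_amount * loyalty_rate
    return reward_dollars / share_price

# e.g. $120 spent at a retailer offering 2% back, with the stock at $180/share:
print(round(shares_earned(120, 0.02, 180.0), 5))  # 0.01333 shares
```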

Bumped’s app allows users to track their shares

For founder and CEO David Nelsen, the startup doesn’t just make good business sense, it can have a wider social impact of democratizing access to the public equity markets. “A lot of brands need to build an authentic relationship with the customers,” he explained to me. “The brands that have a relationship with consumers, beyond price, are thriving.” With Bumped, Nelsen’s goal is to “align the interests of a shareholder and consumer, and everybody wins.”

His mission is to bring more Americans into the equity markets and the power of ownership. He notes that far too many people fail to set up their 401(k)s and don’t invest regularly in the stock market, citing a statistic that only 13.9% of people directly own a share of stock. By offering shares, he hopes Bumped gets consumers to think about their relationship with companies in a whole new way. As Nelsen put it, “we are talking about bringing a whole new class of shareholders into the market.”

This isn’t the first time that Nelsen has built a company in the loyalty space. He previously was a co-founder and CEO of Giftango, a platform for prepaid digital gift cards that was acquired by InComm in late 2012.

Consumers will have to choose their Bumped loyalty partner in each category, like burgers

That previous experience has helped the company build an extensive roster for launch. Bumped has 19 brands participating in the beta, including Chipotle, Netflix, Shake Shack, Walgreens and The Home Depot. Another six brands are currently papering contracts with the firm.

Ciporin of Canaan said that he wanted to fund something new in the loyalty space. “There has been just a complete lack of innovation in the loyalty space,” he explained to me. “I think about it as Robinhood meets airline points programs.” One major decider for Ciporin in making the investment was academic research, such as this paper by Jaakko Aspara, showing that becoming a shareholder in a company tended to make consumers significantly more loyal to those brands.

In the short run, Bumped heads into a crowded loyalty space that includes companies like Drop, which I have covered before on TechCrunch. Nelsen believes that the stock ownership model is “an entirely different mechanism” in loyalty, and that makes it “hard to compare” to other loyalty platforms.

Longer term, he hints at exploring how to offer this sort of equity loyalty model to small and medium businesses, a significantly more complex challenge given the lack of liquid markets for their equity. Today, the company is exclusively focused on publicly-traded companies.

Bumped today has 14 people, and is targeting a team size of around 20 employees.

Wednesday, 27 June 2018

Samsung will probably unveil the Note 9 on August 9

Those Galaxy Note 9 rumors have been coming fast and furious in recent weeks, and now we know why. Samsung just sent out invites for its next big event in New York City, and its beloved phablet seems all but guaranteed to show up. The timeframe certainly lines up.

The pen-enabled line was first announced at IFA back in 2011, and while the company has moved away from the trade show toward its own stage in recent years, announcements have more or less stayed within that August/September timeframe. And holding the event on August 9? Well, that’s likely more than just a numerological coincidence. As if all that weren’t confirmation enough, the handset appears to have also recently passed through the FCC (alongside the Tab S3 tablet), a surefire sign that it’s just over the horizon.

The phone was the subject of a big leak earlier this week that hinted at an update to the line’s iconic S Pen stylus. Exact details are pretty thin at the moment, though one leaker called it “the biggest update” in the peripheral’s history, for what that’s worth. And the close-up shot on this morning’s invites does appear to confirm a focus on the stylus. Samsung has refined the S Pen’s writing system in the seven years since the first device was announced, but it’s largely taken a backseat to things like screen design and camera specs.

Otherwise, however, Note 9 reports paint a picture of fairly minor upgrades over the Note 8, with plenty of features cribbed from the S9 announced back in February at Mobile World Congress.

Facebook tests 30-day keyword snoozing to fight spoilers, triggers

Don’t want to know the ending to a World Cup game or Avengers movie until you’ve watched it, or just need to quiet an exhausting political topic like “Trump”? Facebook is now testing the option to “snooze” specific keywords so you won’t see them for 30 days in News Feed or Groups. The feature is rolling out to a small percentage of users today. It could make people both more comfortable browsing the social network when they’re trying to avoid something, and not feel guilty posting about sensitive topics.

The feature was first spotted in the Facebook app’s code by Chris Messina on Sunday, who told TechCrunch he found a string for “snooze keywords for 30 days”. We reached out to Facebook on Monday, which didn’t initially respond, but last night provided details we could publish at 5am this morning ahead of an official announcement later today. The test follows the rollout of snoozing people, Pages and Groups last December.

To snooze a keyword, you first have to find a post that includes it. That kind of defeats the whole purpose since you might run into the spoiler you didn’t want to see. But when asked about that problem, a Facebook spokesperson told me the company is looking into adding a preemptive snooze option in the next few weeks, potentially in News Feed Preferences. It’s also considering a recurring snooze list so you could easily re-enable hiding your favorite sports team before any game you’ll have to watch on delay.

For now, though, when you see the word, you can hit the drop-down arrow on the post, which will reveal an option to “snooze keywords in this post”. Tapping that reveals a list of nouns from the post you might want to nix, without common words like “the” in the way. So if you used the feature on a post that said “England won its World Cup game against Tunisia! Yes!”, the feature would pull out “World Cup”, “England” and “Tunisia”. Select all that you want to snooze, and posts containing them will be hidden for a month. Currently, the feature only works on text, not images, and won’t suggest synonyms you might want to snooze as well.
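
A rough sketch of the flow described above: suggest candidate keywords from a post (minus common stopwords; the real feature appears to pick out nouns), then hide any feed story containing a keyword that is still inside its 30-day snooze window. This is purely illustrative and not Facebook’s implementation.

```python
# Purely illustrative sketch of the snooze flow; not Facebook's implementation.
from datetime import datetime, timedelta

STOPWORDS = {"the", "a", "an", "its", "and", "of"}
SNOOZE_DAYS = 30

def suggest_keywords(post_text: str) -> list[str]:
    """Candidate terms to offer in the 'snooze keywords in this post' menu."""
    words = [w.strip("!?.,") for w in post_text.split()]
    return sorted({w for w in words if w and w.lower() not in STOPWORDS})

def snooze(snoozed: dict, keyword: str, now: datetime) -> None:
    """Record a keyword and the time its 30-day snooze window expires."""
    snoozed[keyword.lower()] = now + timedelta(days=SNOOZE_DAYS)

def should_hide(post_text: str, snoozed: dict, now: datetime) -> bool:
    """Hide the post if it contains any keyword still inside its snooze window."""
    text = post_text.lower()
    return any(kw in text and until > now for kw, until in snoozed.items())
```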

The spokesperson says the feature “was something that kept coming up” in Facebook interviews with users. The option applies to any organic content, but you can’t block ads with it, so if you snoozed “Deadpool” you wouldn’t see posts from friends about the movie but might still see ads to buy tickets. Facebook’s excuse for this is that ads belong to “a separate team, separate algorithm”, but surely it just doesn’t want to open itself up to users mass-blocking its revenue driver. The spokesperson also said that snooze data isn’t currently being used for other content or ad targeting purposes.

We asked why users can’t permanently mute keywords like Twitter launched in November 2016, or the way Instagram launched keyword blocking for your posts’ comments in September 2016. Facebook says “If we’re hearing from people that they want more or less time” that might get added as the feature rolls out beyond a test. There is some sense to defaulting to only temporary muting, as users might simply forget they blocked their favorite sports team before a big game, and then wouldn’t see it mentioned forever after.

But when it comes to abuse, permanent muting is something Facebook really should offer. Instead it’s relied on users flagging abuse like racial slurs, and it recently revealed its content moderation guidelines. Some topics that are fine for others could be tough for certain people to see, though, and helping users prevent trauma probably deserves to be prioritized above stopping reality TV spoilers.

Tuesday, 26 June 2018

Instagram now lets you 4-way group video chat as you browse

Instagram’s latest assault on Snapchat, FaceTime and Houseparty launches today. TechCrunch scooped back in March that Instagram would launch video calling, and the feature was officially announced at F8 in May. Now it’s actually rolling out to everyone on iOS and Android, allowing up to four friends to group video call together through Instagram Direct.

With the feed, Stories, messaging, Live, IGTV, and now video calling, Instagram is hoping to become a one-stop-shop for its 1 billion users’ social needs. This massive expansion in functionality over the past two years is paying off according to SimilarWeb, which estimates that the average US user has gone from spending 29 minutes per day on the app in September 2017 to 55 minutes today. More time spent means more potential ad views and revenue for the Facebook subsidiary that a Bloomberg analyst just valued at $100 billion after it was bought for less than $1 billion in 2012.

One cool feature of Instagram Video Calling is that you can minimize the window and bounce around the rest of Instagram without ending the call. That opens new opportunities for co-browsing with friends as if you were hanging out together. It also makes it more functional than Bonfire. However, Bonfire allows for an unlimited number of video chat partners across different rooms, while Snapchat launched group video calling with up to 16 friends in April. Facebook seems to have the technology to allow more chat partners, though, since Messenger can do six-way calls with up to 50 friends listening in over audio. More friends can join an Instagram call in progress, though you can mute them if you don’t want to get more call invites. You’re allowed to call anyone you can Direct message by hitting the video button in a chat, and blocked people can’t call you.

Instagram is also rolling out two more features promised at F8. The Explore page will now be segmented to show a variety of topic channels that reveal associated content below. Previously, Explore’s 200 million daily users saw a random mish-mash of popular content related to their interests, with only a single “Videos You Might Like” section separated out.

Now users will see a horizontal tray of channels atop Explore, including an algorithmically personalized For You collection, plus ones like Art, Beauty, Sports, and Fashion depending on what content you regularly interact with. Users can swipe between the categories to browse, and then scroll up to view more posts from any they enjoy. A list of sub-hashtags appears when you open a category, like #MoGraph (motion graphics) or #Typeface when you open art. And if you’re sick of seeing a category, you can mute it. Strangely, Instagram has stripped Stories out of Explore entirely, but when asked, the team told us it plans to bring Stories back in the near future.

The enhanced Explore page could make it easier for people to discover new creators. Growing the audience of these content makers is critical to Instagram as it strives to be their favorite app amid the competition. Snapchat lacks a dedicated Explore section or other fan base-growing opportunities, which has alienated some creators, while the new Instagram topic channels are reminiscent of YouTube’s mobile Trending page.

Instagram’s new Explore Channels (left) vs YouTube’s Trending page (right)

Finally, Instagram is rolling out camera effects designed by partners, starting with Ariana Grande, BuzzFeed, Liza Koshy, Baby Ariel and the NBA. If you’re following these accounts, you’ll see their effect in the Stories camera, and you can hit “Try It On” if you spot a friend using one you like. This opens the door to accounts all offering their own augmented reality and 2D filters without the Stories camera becoming overstuffed with lenses you don’t care about.

Instagram’s new partner-made camera effects

What’s peculiar is that all of these features are designed to boost the amount of time you spend on Instagram just as it’s preparing to launch a Usage Insights dashboard for tracking if you’re becoming addicted to the app. At least the video calling and camera effects promote active usage, but Explore definitely encourages passive consumption that research shows can be unhealthy.

Therein lies the rub between Instagram’s mission and business model and its commitment to user wellbeing. Despite CEO Kevin Systrom’s stated intention that “any time [spent on his app] should be positive and intentional” and that he wants Instagram to “be part of the solution”, the company earns more by keeping people glued to the screen rather than present in their lives.

The new Google Maps with personalized recommendations is now live

At its I/O developer conference last month, Google previewed a major update to Google Maps that promised to bring personalized restaurant recommendations and more to the company’s mapping tool. Today, many of these new features started rolling out to Google Maps users.

The core Google Maps experience for getting directions hasn’t changed, of course, but the app now features a new ‘explore’ tab that lets you learn more about what’s happening around you, as well as a ‘for you’ tab that provides you with recommendations for restaurants, lists of up-and-coming venues, and the ability to ‘follow’ neighborhoods and get updates when new restaurants and cafes open that you would probably like. The main difference between the Explore and For You tabs is that the former is all about giving you recommendations for right now, while the latter is more about planning ahead and keeping tabs on an area in the long run.

While most of the other features are rolling out to all users worldwide, the new ‘for you’ tab and the content in it is only available in the U.S., U.K., Canada, Australia and Japan for now. Content in this tab is still a bit limited, too, but Google promises that it’ll ramp up content over the course of this week.

Both of the new tabs include plenty of new features. There is the ‘foodie list,’ for example, which shows you the hottest new restaurants in an area. And if you’re feeling completist, Google will keep track of which of these places you’ve been to and which ones you still have to visit. Like before, the Explore tab also features automatically curated lists of good places to go for lunch, with kids or for a romantic dinner. It’s not just about food and coffee (or tea), though; those lists also include other activities, and Google Maps can now also highlight local events.

With this launch, Google is also releasing its new ‘Your Match’ scores, which assign a numeric rating to each restaurant or bar depending on your previous choices and ratings. The idea here is that while aggregate ratings are often useful, your individual taste often differs from the masses. With this new score, Google tries to account for that. To improve these recommendations, you can now also explicitly tell Maps which cuisines and restaurants you like.
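
Google hasn’t published how ‘Your Match’ is computed, but conceptually it blends a venue’s aggregate rating with signals about your own tastes. The toy weighted blend below is entirely hypothetical and just illustrates the idea of a personalized score.

```python
# Entirely hypothetical toy version of a personalized match score.
def match_score(aggregate_rating: float, cuisine_affinity: float, weight: float = 0.5) -> int:
    """Blend the venue's overall rating (0-5) with how much this user likes its
    cuisine (0-1, e.g. learned from past choices and ratings) into a 0-100 score."""
    blended = (1 - weight) * (aggregate_rating / 5.0) + weight * cuisine_affinity
    return round(blended * 100)

print(match_score(4.3, 0.9))  # e.g. 88
```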

It’s worth noting that there are still some features that Google promised at I/O that are not part of this release. Group planning, for example, which allows you to create a list of potential meet-up spots and lets your friends vote on them, is not part of this release.

The updated Google Maps for iOS and Android is now available in the Play Store and App Store.

If you’d like to read more about Google’s rationale for many of these changes, also take a look at our in-depth interview with Sophia Lin, Google’s senior product manager on the Google Maps team, from I/O.