Facebook can now sync your Instagram contacts to Messenger

Facebook wants to expand your Messenger contact list with a little help from Instagram. The company has launched a feature in Messenger that pulls in your contacts from Instagram, if you opt to connect your account. The option appears in Messenger’s “People” tab, alongside the existing option to sync your phone’s contacts with Messenger.

The feature was first spotted by Jane Manchun Wong, who posted a screenshot to Twitter.


Others outside the U.S. noticed the option as well.


We also found the option enabled in our own Messenger app, and have now confirmed with Facebook that this is a full public launch.

When you tap “Connect Instagram,” Messenger adds your Instagram contacts automatically. In addition, your Instagram username and account then become visible to other people on Messenger.

The result is an expanded social graph of sorts — one that combines the friends and family you know from Facebook, with those you know from Instagram.

Not everyone is thrilled with the feature, however.

As one Twitter user pointed out, it’s not clear that tapping “Connect Instagram” (the button’s title as it appeared to some) means Messenger will automatically add your Instagram contacts. It seems you should be given a choice as to whether you want to add them, but that’s not the case.


In December 2017, TechCrunch spotted a very similar option to sync Instagram contacts to Messenger in the same People section. However, the option never launched to the public and later disappeared. But the recent re-emergence of the feature is not a continued test — it’s now rolled out, Facebook says.

This is not the first time Facebook has added integrations between its apps.

For example, in 2016 it gave businesses access to a unified inbox of conversations from across its platforms, including Facebook, Instagram and Messenger. Last year, it also tested a cross-app notification feature. There’s even an option to launch Facebook right in Instagram itself, via an icon on your Instagram profile page.

The timing of the launch is notable, given that Instagram’s own Direct messaging service has become a popular communications platform in its own right.

Instagram Direct as of April 2017 had 375 million users, and was spun off into its own standalone app last year in select countries outside the U.S. With so many users now messaging through Facebook-owned Instagram, it’s clear that Facebook wants to capitalize on that activity to grow its own Messenger app, too.

from Social – TechCrunch https://techcrunch.com/2018/07/18/facebook-can-now-sync-your-instagram-contacts-to-messenger/
via Superb Summers


Reddit expands chat rooms to more subreddits

If you’d rather chat with strangers who share a hyper-specific interest than keep up with your coworkers’ stale memes on Slack, Reddit is ready for you. The platform has quietly been working on a chat room feature for months, and today it expands beyond its early days as a very limited closed beta.

Plenty of subreddits already make use of a chat room feature, but these live outside of Reddit, usually on Slack or Discord. Given that, it makes sense for Reddit to lure those users back into engaging on Reddit itself by offering its own chat feature.

I spent a little time hanging out in the /r/bjj (Brazilian jiu jitsu) chat as well as a psychedelics chat affiliated with r/weed to see how things went across the spectrum, and it was pretty chill — mostly people asking for general advice or seeking answers to specific questions. In a Reddit chat linked to the r/community_chat subreddit — the hub for the new chat feature — redditors discussed whether the rooms would lead to more or less harassment and whether the team should add upvotes, downvotes and karma to chat to make it more like Reddit’s normal threads. Of course, what I saw is probably a far cry from what chat will look like if and when some of Reddit’s more inflammatory subreddits get their hands on the new feature. We’ve reached out to Reddit to ask whether all subreddits, even the ones hidden behind content warnings, will be offered the new chat functionality.

Chat rooms are meant as a supplement to already active subreddits, not standalone communities, so it’s basically like watching a Reddit thread unfold in real time. On the Reddit blog, u/thunderemoji writes about why Reddit is optimistic that chat rooms won’t just be another trolling tool:

“I was initially afraid that most people would bring out the pitchforks and… unkind words. I was pleasantly surprised to find that most people are actually quite nice. The nature of real-time, direct chat seems to be especially disarming. Even when people initially lash out in frustration or to troll, I found that if you talk to them and show them you’re a regular human like them, they almost always chill out.

“Beyond just chilling out, people who are initially harsh or skeptical of new things will actually often change their minds. Sometimes they get so excited that they start to show up in unexpected places defending the thing they once strongly opposed in a way that feels more authentic than anything I could say.”

While a few qualitative experiences can only go so far to allay fears, Reddit’s chat does have a few things going for it. For one, moderators add chat rooms. If a subreddit’s mods don’t think they can handle the additional moderation, they don’t have to activate the feature. (A Wired piece on the thinking behind chat explores some of these issues in more depth.)

In the same post, u/thunderemoji adds that Reddit “made moderation features a major priority for our roadmap early in the process” so that mods would have plenty of tools at their disposal. Those tools include an opt-in process, auto-banning users from chat who are banned from a subreddit, “kick” tools that suspend a user for 1 minute, 1 hour, 1 day or 3 days, the ability to lock a room and freeze all activity, rate limits and more.

To sign up for chat rooms (mods can add as many as they’d like once approved), a subreddit’s moderators can add their name to a list that lives here. To find chat rooms to explore, you can check for a link on subreddits you already visit, poke around the sidebar in this post by Reddit’s product team or check out /r/SubChats, a dedicated new subreddit collecting active chat rooms that accompany interest and community-specific subreddits.

from Social – TechCrunch https://techcrunch.com/2018/07/18/reddit-chat-rooms/

Undercover report shows the Facebook moderation sausage being made

An undercover reporter with the UK’s Channel 4 visited a content moderation outsourcing firm in Dublin and came away rather discouraged by what they saw: queues of flagged content waiting, videos of kids fighting staying online, orders from above not to take action on underage users. It sounds bad, but the truth is there are pretty good reasons for most of it, and in the end the report comes off as rather naive.

Not that it’s a bad thing for journalists to keep big companies (and their small contractors) honest, but the situations called out by Channel 4’s reporter seem to reflect a misunderstanding of the moderation process rather than problems with the process itself. I’m not a big Facebook fan, but in the matter of moderation I think they are sincere, if hugely unprepared.

The bullet points raised by the report are all addressed in a letter from Facebook to the filmmakers. The company points out that some content needs to be left up because abhorrent as it is, it isn’t in violation of the company’s stated standards and may be informative; underage users and content has some special requirements but in other ways can’t be assumed to be real; popular pages do need to exist on different terms than small ones, whether they’re radical partisans or celebrities (or both); hate speech is a delicate and complex matter that often needs to be reviewed multiple times; and so on.

The biggest problem doesn’t at all seem to be negligence by Facebook: there are reasons for everything, and as is often the case with moderation, those reasons are often unsatisfying but effective compromises. The problem is that the company has dragged its feet for years on taking responsibility for content and as such its moderation resources are simply overtaxed. The volume of content flagged by both automated processes and users is immense and Facebook hasn’t staffed up. Why do you think it’s outsourcing the work?

By the way, did you know that this is a horrible job?

Facebook in a blog post says that it is working on doubling its “safety and security” staff to 20,000, of which 6,500 will be on moderation duty. I’ve asked what the current number is, and whether that includes people at contractors like this one (which has about 650 reviewers), and will update if I hear back.

Even with a staff of thousands the judgments that need to be made are often so subjective, and the volume of content so great, that there will always be backlogs and mistakes. It doesn’t mean anyone should be let off the hook, but it doesn’t necessarily indicate a systematic failure other than, perhaps, a lack of labor.

If people want Facebook to be effectively moderated they may need to accept that the process will be done by thousands of humans who imperfectly execute the task. Automated processes are useful but no replacement for the real thing. The result is a huge international group of moderators, overworked and cynical by profession, doing a messy and at times inadequate job of it.

from Social – TechCrunch https://techcrunch.com/2018/07/17/undercover-report-shows-the-facebook-moderation-sausage-being-made/