Travala conducts AMA to answer community questions about chain migration announcement

However, the team aims to utilize Cosmos’ Inter-Blockchain Communication (IBC) protocol for future smart contract functionality. Further, Travala aims …

On Monday, September 9th, Travala conducted an ask me anything (AMA) on its subreddit to answer questions surrounding its recent announcement to migrate to the Binance Chain. Participants could pose two questions per post, which Travala answered within 48 hours of the AMA's start. At press time, the AMA thread is locked.

Many participants sought to learn the reasoning behind migrating to the Binance Chain. A Travala representative, “cryptokyle,” responded, “we first considered moving to another chain after the series of reliability issues from the NEO blockchain.” He went on to say, “we have been looking into the Binance Chain before the NEO investment,” referring to the EcoFund investment Travala accepted from NEO Global Development (NGD) this past July.

Travala states it had decided to move to the Binance Chain to “maximize [AVA] token economics, usability, security, and speed.” Further, the team states it sought to “align with an industry leader that [it believes] will help on its mission to allow people to book travel with cryptocurrencies.”

One participant asked, “how are smart contracts going to be implemented on Binance chain after dropping NEO?” Travala acknowledged that Binance Chain does not currently support Turing-complete smart contracts on-chain. However, the team aims to utilize Cosmos’ Inter-Blockchain Communication (IBC) protocol for future smart contract functionality. Further, Travala aims to port what it has built on the NEO blockchain to Tendermint BFT and IBC.

Regarding a token swap following the migration, Travala states “there will be a very long runway” to swap NEP-5 AVA for the BEP2 token standard. The team aims to publish a “clear guide” on the initial partial migration to Binance Chain, as well as when the full migration occurs.

Note: NEP-5 AVA that is not swapped “within the defined timeline” will be burned.

The full AMA can be found at the link below:

Homeis raises $12 million to foster online immigrant communities

Following a $4 million seed round, the New York- and Israel-based startup raised $12 million in series A funding led by Canaan and Spark Capital, …

Immigrating to a new country is not easy. Culture shock is all too common for the hundreds of millions who leave their birthplace each year in search of better opportunities, as unfamiliar customs, architecture, and dialects become the rule rather than the exception. It’s this dynamic that draws newly arrived residents to ethnic enclaves in cities like New York, Houston, Los Angeles, and San Francisco, where local communities serve as links to far-flung places of origin.

There’s been no true online corollary, however, save loosely affiliated groups on Reddit, WhatsApp, Facebook, and other social media platforms. That’s why Ran Harnevo, former president of video for media company Oath, and Hanan Laschover, former CEO of AOL Israel, in 2017 teamed up to launch Homeis, which the pair describe as a “culture network” for foreign-born nationals and expats. Following a $4 million seed round, the New York- and Israel-based startup raised $12 million in series A funding led by Canaan and Spark Capital, bringing its total raised to about $16 million.

Harnevo, who serves as CEO, said the fresh capital will enable Homeis to expand its service to the roughly 250 million immigrants around the world, starting with Mexican and other Latinx communities in the U.S. Homeis currently serves “hundreds of thousands” of users across Indian, Israeli, French, and other communities domestically.


“Empowering immigrants is a personal mission, one which we believe is timely and important,” added Harnevo, who previously cofounded 5min, a syndication platform for lifestyle and instructional videos that was acquired by AOL for $65 million.

Homeis’ eponymous app for Android and iOS encourages users to share experiences with (and contribute knowledge to) their communities. Users start by answering two questions at first login: the country they’re originally from and the city they live in. From that point forward, they’re afforded access to vetted and community-created social and utility resources that help them network; organize meetings; arrange appointments with doctors, lawyers, and other professionals; and foster connections with independently run businesses and restaurants.

Homeis curates guides to various aspects of expat life, like opening bank accounts, filling out visa applications, and filing taxes for the first time. And the app hosts a marketplace where community members can pick up items they’re looking for or sell belongings they wish to get rid of. Separately, Homeis users can also create events or groups to connect with others over shared hobbies and interests, and they’re able to filter feeds and recommendations so those that interest them bubble to the top.

“We are thrilled to have the support of our investors and are committed to supporting immigrants wherever they are, making technology work for them and fulfilling their very special needs,” said Harnevo.

Backers include Abstract VC, the Chernin Group, Samsung Next, Reddit cofounder Alexis Ohanian, former AOL CEO Tim Armstrong, and Taboola CEO Adam Singolda.



Unexpected skincare trends turn up in NLP reads of social media

The team used an unsupervised AI version of an NLP method called Latent Dirichlet Allocation (LDA), and it captured enlightening discussions of …

“Our results demonstrate that Reddit specifically is an effective source of user-generated information on various dermatology-related issues,” Lipoff et al. concluded. “While these data did not highlight any one disease for physicians to follow up with formal study, we did identify many trends in patient engagement.”

The authors acknowledged as a limitation in their study design the use of LDA, which, as an unsupervised model, lacks a source of ground truth to which the model’s output can be compared.

“However, as these forums are anonymous, there seems little incentive for patients to be dishonest,” the authors noted, adding that Reddit data have “viability and utility for dermatologic research and engagement with the public.”


How cryptocurrency discussions spread

The team measured the speed and scale of discussion spread related to Bitcoin, Ethereum, and Monero cryptocurrencies. Findings include …

RICHLAND, WASHINGTON, June 10, 2019 – A rapidly increasing percentage of the world’s population is connected to the global information environment. At the same time, the information environment is enabling social interactions that are radically changing how and at what rate information spreads. As part of an effort to understand communication patterns and build a quantitative framework for how this information expands online, researchers at Pacific Northwest National Laboratory, a United States Department of Energy national laboratory, recently examined cryptocurrency discussion threads on Reddit. Their findings, presented at the Web Conference 2019, not only shed light on how cryptocurrency discussions spread but could inform artificial intelligence applications for modeling information spread across internet social environments to help identify and model criminal activities by state and non-state actors exploiting cryptocurrencies.

Every day, thousands of messages on Reddit [and elsewhere] contain discussions of cryptocurrencies. Some of these trigger follow-up discussions. Some lead to increased interest in a cryptocurrency — not just a discussion of that cryptocurrency, but its actual price/value. Clearly, not all cryptocurrencies are equivalent — and analyzing a set of them can reveal how bad actors might exploit those differences.

Led by PNNL data scientist Dr. Svitlana Volkova, the team analyzed three years of discussions on Reddit from January 2015 to January 2018. The team measured the speed and scale of discussion spread related to Bitcoin, Ethereum, and Monero cryptocurrencies. Findings include:

  • On average, there are 3,600 comments posted each day for Bitcoin; 500 for Ethereum; and 380 for Monero.
  • Not every post triggered a discussion. The rate of ineffective posts (posts that do not receive any additional commentary) is up to five times higher for Bitcoin and Ethereum than for Monero.
  • People respond quicker to Bitcoin posts. For example, it takes on average 11 minutes for someone within the Bitcoin subreddit to comment; 20 minutes for Monero; and 27 minutes for Ethereum.
  • Bitcoin discussions grow the fastest, followed by Ethereum and then Monero.
  • Discussions around Monero have the largest median lifetimes. Ethereum discussions have the largest maximum lifetimes, and Bitcoin discussions have the lowest median lifetime of the three coins.
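Metrics like these fall out directly from post and comment timestamps. A minimal sketch of the computation, using made-up thread data rather than the team's Reddit corpus:

```python
# Sketch of two of the reported thread metrics, computed from
# (post_time, [comment_times]) pairs. The data below is illustrative.
from statistics import median

# Each entry: (post timestamp in minutes, timestamps of its comments).
threads = [
    (0, [11, 30, 45]),   # post that triggered a discussion
    (100, []),           # "ineffective" post: drew no comments
    (200, [214]),        # post with a single reply
]

# Share of posts that never received a comment.
ineffective_rate = sum(1 for _, c in threads if not c) / len(threads)

# Median delay until the first comment, over posts that were discussed.
first_delays = [comments[0] - t for t, comments in threads if comments]
median_first_response = median(first_delays)

print(ineffective_rate)        # fraction of posts with no replies
print(median_first_response)   # minutes until the first comment
```

The same per-thread delays, aggregated per subreddit, would yield the 11/20/27-minute response-time comparison the team reports.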

“Cryptocurrencies are quite unique observable units of information in the way discussions about them spread across social platforms,” said Volkova. “These social signals are quite useful, and by incorporating them with machine and deep learning, we intend to build predictive models that hit on causal relationships between different variables so we can explain model decision-making processes.”

The findings gathered by Volkova and her colleagues, PNNL researchers Maria Glenski and Emily Saldanha, provide valuable insights into how information expands and evolves in social platforms. Likewise, by focusing on a specific social phenomenon, the team hopes to compare the spread of cryptocurrency discussions with other viral trends, such as the spread of computer vulnerabilities and misinformation.


Additional information on the team’s findings can be found at the link below. The work was funded by the Defense Advanced Research Projects Agency.

About PNNL

Pacific Northwest National Laboratory draws on signature capabilities in chemistry, earth sciences, and data analytics to advance scientific discovery and create solutions to the nation’s toughest challenges in energy resiliency and national security. Founded in 1965, PNNL is operated by Battelle for the U.S. Department of Energy’s Office of Science. DOE’s Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit PNNL’s News Center. Follow us on Facebook, Instagram, LinkedIn and Twitter.

Media Contact:

Nick Hennen

Media Relations Advisor



“They’re Just Wrong”—Ex-Reddit CEO Ellen Pao Has Harsh Words on Big Tech’s Failure to Stop …

When Pao first came forward with her stories of sexual harassment and discrimination at the venture capital firm, Kleiner Perkins Caufield & Byers …

Mother Jones illustration; Courtesy Ellen Pao

Ellen Pao sounds tired. “I actually haven’t been on Reddit in six months,” she says, a remarkable admission for the company’s former CEO. She says she’s tired; tired of her old colleagues’ decisions on hate speech, tired of similar decisions from other social media giants, and tired of being harassed online.

It’s been almost four years since Pao led Reddit. Since then she’s navigated the fallout of an unsuccessful sexual harassment lawsuit she brought that left her ostracized in Silicon Valley circles. When Pao first came forward with her stories of sexual harassment and discrimination at the venture capital firm Kleiner Perkins Caufield & Byers, where she was a partner, people, men especially, doubted her claims. She became a pariah at the firm and was attacked online, but her case has come to be seen as a forerunner of the #MeToo movement.

It wasn’t the first time she became a target online, having endured already heavy internet harassment following her decision, just before leaving the company in 2015, to ban some of Reddit’s most toxic subreddits.

Since then, social media companies have taken some steps to address rampant abuse, harassment, and hate that’s proliferated on their platforms. But they haven’t stamped it out to the extent that other types of harmful content have been sidelined.

Reddit continues to struggle with where to draw the line on the popular The_Donald subreddit, home to a xenophobic and sometimes harassing group of Trump supporters, and has been unable to stop abusive behavior like brigading, where groups coordinate large troll campaigns by targeting and infiltrating other communities.

Pao now serves as the CEO of Project Include, a nonprofit that advocates diversity in the technology industry. Even though she’s no longer at an internet company, Pao still has a lot of thoughts on what platforms should do to stop the spread of hate and harassment, and she’s disappointed by how little they’ve accomplished.

Our conversation happened not long after the killings at mosques in Christchurch, New Zealand, by a shooter whose internet behavior and manifesto left little doubt he had been radicalized online. Though his writings didn’t make specific reference to Reddit, the site has acted as a bridge between the darker, bigotry-filled recesses of the internet like 4chan and 8chan (where the shooter did post) and mainstream sites like Facebook, Twitter, and YouTube. In the wake of the shooting, the worst subreddits hosted hate-filled homages to the shooters.

Mother Jones talked to the former Reddit CEO about how she thinks her previous industry can take the harassment, abuse, and hate speech it’s enabled more seriously. She’s not confident that they will.

Mother Jones: Reddit seems resistant to banning some of the worst types of subreddits on its platform, including ones that have espoused Islamophobia in the wake of the Christchurch shooting. I was curious to get your take as someone who made key decisions to start banning these types of toxic subreddits.

Ellen Pao: Part of it is fear—the fear that platform employees have of their users drives a lot of decision making in ways you don’t expect. My guess is that there’s a lot of fear of people who are running and using The_Donald. It’s not a subreddit that is following the rules or should actually be there. Part of it is usage: It’s hard to take down a subreddit which is driving a lot of traffic.

But at the end of the day, if you don’t follow your rules, you end up having more people not follow the rules. If your platform doesn’t hold people accountable to the rules then your rules really don’t have that much meaning. You see that not just on Reddit, but on Twitter, where people are allowed to break rules because of newsworthiness, which is very vague. People don’t know what that actually means and you end up with more and more vitriol.

MJ: Those arguments are used for things like President Trump’s Twitter even though he’s probably violated a lot of platform rules. Do you think his account is at a worthwhile level of newsworthiness?

EP: From having run a platform, you have to have one standard—otherwise it really doesn’t mean anything. When you see the impact of the words, when you see the impact of the content, and it leads to harassment and you have experienced that harassment, you have a different perspective. It’s not just words. It’s not just speech. It can be very hurtful.

Thinking about the recent suicides among the people from Parkland and the father from Sandy Hook, this ongoing harassment and abuse has an impact on people. I can’t imagine losing someone and having people say “they’re alive” and that I’m a liar as I’m trying to grieve and overcome what is probably the most horrible experience of my life. This should not be allowed, and I do think it contributed to the father’s death by suicide. I can’t imagine that it didn’t.

There is a strong link between bullying, suicide, self-harm, and suicidal behaviors with harassment online. That’s stuff that people have actually researched and proven. So when you’re in charge of a platform and you allow this kind of behavior, I hope these people are holding themselves accountable for what is happening and what is coming out of their platforms. If they really believe the value of this so-called “public square” or so-called “free speech” is worth it, they should understand what the consequences are and be forced to acknowledge that—in the interest of building these platforms, in the interest of generating revenue, in the interest of generating engagement—people are being harassed to the point of where they actually are suicidal.

MJ: These points are good and important, but at the same time, we’ve been having these types of conversations for the last couple of years, especially since people like you have brought these types of issues to light. Are there new things in this? Are things changing at all?

EP: I can’t tell. I think there is a kind of acceptance of harassment on these platforms now. It’s become so widespread. More people are experiencing it, unfortunately, and you see it on multiple threads. You get into a hate train or trolling and it’s more prevalent; especially in this political environment and this environment of fake news, people have started to tolerate it more. People are also starting to understand the platforms’ role in it, are seeing Facebook’s problems with Myanmar, its problems with fake news, its problems with allowing Holocaust deniers a platform. It was five years ago when the Southern Poverty Law Center wrote about how Reddit was becoming an on-ramp for Stormfront, one of the worst sites for white supremacy. And nobody seemed to really care. I’d think people would really pay attention to a report like that. People would say, “Oh, this is actually really bad.” People now know white supremacy is a huge problem in our country. It’s no longer something we’ve taken care of, or just some fringe groups.

MJ: You’ve talked about how you think these issues are partially being driven by the lack of representation of marginalized people—women, people of color.

EP: I think it’s fascinating that at the largest platforms we don’t have people of color, women of color—and really not very many women—on the boards, on executive teams. I’m sure there are, but I don’t know a single product leader at Facebook who is female. I think there is one or two, but you don’t hear that much about them. You don’t hear about anybody leading anything at Twitter other than diversity and inclusion. The people who have the positions that are making these decisions, like product, like engineering, they’re predominately white men. And that’s true at Twitter and that’s true at Reddit and that’s true at Facebook.

I think that the inability to connect to the harassment, to feel empathy for the people who are experiencing the harassment at this level, prevents them from actually trying to solve it.

MJ: If you were still at Reddit, are there things you would do there? Are there subreddits or communities you’d be looking to take enforcement action on right off the bat?

EP: It’s hard. I don’t want to be negative about the way things are being run, but this is a different philosophy. For me, it’s about, “Let’s get a clean platform where everybody can have a conversation and actually have a real conversation and not just have as many people shout at each other as possible.” It’s been a long time since you could really go to Reddit and see that conversations were authentic.

If that’s what Twitter and Reddit are trying to have, then you’ve gotta get rid of people who are shouting people out of the public square, who are really bullying people. That’s not a conversation. You have to make sure everyone has the space to speak—and make sure that the speech isn’t causing damage—and that’s not happening right now.

MJ: I’ve noticed brigading going on where you have different people coordinating raids on a different group, stifling conversations.

EP: One way it plays out is one group of bad actors will brigade a set of people, usually on a subreddit, and if your subreddit is being brigaded, you don’t really have the tools to really stop it. Nobody from the company is really helping you, so you end up brigading the other subreddits. The best defense is a good offense. You use those same patterns of behavior to attack the people who are attacking you because you don’t have any other tools. That is the way that it happens. The people who are defending themselves are totally getting punished because they end up being the ones who attack. You end up with this horrible situation where they weren’t initially the bad actors, they’re just trying to defend themselves and they get totally punished by the system. That just happens time and time again, and it’s a terrible experience for everyone. The subreddits are a mess.

MJ: It seems like there’s been a lot of off-platform coordination with this, especially in places like Discord. And this isn’t just a Reddit issue. It’s happening on Twitter, it’s happening on 4chan. How much culpability do you think those types of places have in this ecosystem?

EP: It’s been happening for ages and across platforms. For a while it was a way for people to avoid being banned. The platforms didn’t really consider what was going on on other platforms. They looked myopically at their own sites. At Reddit in 2015, we changed the rule so that offsite behavior was considered in applying the rules, as well as onsite behavior because there was just too much going on that was outside of Reddit and it was super coordinated and it was a loophole. People could do things elsewhere and then brigade a subreddit. Because we couldn’t show that it was coordinated on our own site, we had a hard time dealing with it when it was clear it was happening on Twitter and other platforms—4chan and 8chan especially. It’s not a new thing.

MJ: How do these issues get better? With government regulation? I know people have talked about that a lot in regard to privacy, but does it make sense with content moderation too?

EP: I don’t know. Part of me hopes that the CEOs actually do want to do something about it, and it is the fear of their users—fear of their boards, fear of investors—that prevents them from doing the right thing and keeping all their users safe. Regulation will give them an excuse to take on people and their bad behavior in a way that they wouldn’t have to explain and defend, and perhaps they wouldn’t get as much abuse heaped on them, so they kind of have the justification out of their hands.

I don’t think regulators will do it the right way, though. That’s my big reason why I’m not out there pitching it. But we’re at a point where, how could it be worse?

MJ: Is that because you don’t have much faith in Congress as it currently is or because regulators will miss things because they’re not inside these companies?

EP: A little bit of both. We talk about the lack of diversity in tech companies; our government is getting better but it’s not super diverse either. To understand the actual extent of the behavior you’re trying to prevent requires having people from different backgrounds and I don’t know how that’s actually going to happen in the regulatory area. I don’t know though…It’s just gotten so messy.

MJ: You mentioned that you haven’t been on Reddit in six months. Is there a particular reason for that?

EP: I just—I think a lot of it is, some of the content I don’t really trust.

MJ: Was there a period of time when you did trust the information on Reddit more?

EP: Yeah, maybe 7, 8 years ago. But once you see how the sausage is made, it makes you a vegetarian. You see what happens to people. We’ve created this desire for drama and that dopamine rush and that need for conflict that all of our sites are struggling with.

MJ: Are these types of issues your biggest concern for platforms or do you think issues of privacy or something else is more pressing?

EP: Oh man. It’s changed from what I can see, but Reddit, when I was there, we were super, super oriented around privacy. I haven’t had to deal with as many privacy issues because of that. It was easy. We could say ‘Hey, you’re not allowed to share people’s private addresses’ and you’d get rid of a lot of bad behavior there. That was one of the tools people were using—doxxing. That was an easy way to ban people because we have this strong privacy orientation.

Maybe I’m old school, because the things people share and the amount they share, and the detail they share, is something I would never do. There’s so much harassment and people trying to find me that I’m super protective of where I live and what I do. I don’t want to give people a vector to be able to track me down.

I have a slightly different experience and framework. You have these stories of companies being able to track that you’re pregnant. There are these things you’re sharing that you don’t realize you’re sharing. When you do a DNA test to see who your ancestors are and all of a sudden you’ve shared all of your family’s information, I don’t think people understand that. And on Facebook, you’re not just sharing your data, you’re sharing all your friends’ information, it turns out.

MJ: We’re further out from when you were in the spotlight more with your lawsuit and when you were dealing with a lot of harassment. Do you feel like things have ebbed a little bit or is internet harassment still very pressing for you?

EP: Oh, I think it’s so prevalent. You say anything and people are all over you. And you can’t have a conversation because people shut down so quickly. And I don’t blame them. They want to protect themselves, but it’s really hard to have an open-minded conversation with both sides. I think the worst types of harassment are becoming illegal. The fact that revenge porn is no longer allowed, we started that at Reddit. We did it in February 2015. We were the first major platform to ban revenge porn and unauthorized nude photos and then every other platform followed us and now there’s legislation preventing it. It still happens, but it’s to a lesser extent.

People are now more aware of how harassment actually is hurtful and dangerous. I hope that today, what happened to Leslie Jones wouldn’t happen again, but then you look at what people are allowed to say about the Sandy Hook parents and families. Maybe it hasn’t changed. I don’t know. The general perception is that that type of behavior is terrible. But when the platform CEOs allow it and it doesn’t get taken down… the only thing that causes people to take content down is reporters writing about it. That is not a good system.

MJ: Did you have thoughts on the New Zealand shooter? The influence of the internet ran through his manifesto. Do you think that’s a wakeup call for platforms, or do you think these things will persist on fringe sites like 8chan and 4chan, regardless of what these companies do?

EP: I don’t spend time anymore, thankfully, on 4chan or 8chan. I had to for my work at Reddit because of the cross-platform nature of the harassment. And now there’s all these other platforms that came out when we banned a bunch of subreddits. There’s been this philosophy of “If we allow them to flourish under sunlight, we can convince these people and change their minds, or at least we know what we’re up against.” Actually, that was wrong. We actually help these groups and ideas that are antithetical to our values proliferate. We encourage them and give them the spotlight. And they thrive. That is a complete fallacy in the theory of allowing them to stay on your platforms.

They’re just wrong. We haven’t convinced anybody. If anything, we’ve convinced people to join those movements. It’s interesting because I think people are just wrong and we haven’t admitted that yet.

MJ: Do you think there are different ways companies can address toxicity on their platforms?

EP: It would be interesting for people to treat harassment as a real cost of doing business, and to be tracking and to be transparent about all the different things that happen. All of the wrong decisions that were being made. How do we hold people accountable for the bad things happening on their platform so they actually make an investment? We know they have tons of money. We know they have tons of talent. We know they have tons of experience and data. How do we actually get them to take action and how we do hold them accountable other than reporters, after the fact, calling attention to it?

We need to look at what metrics we should be tracking and maybe that’s a way to regulate. Are there metrics we should be tracking around the hate and around the harassment on these sites that help us know if things are getting better or worse and if companies are investing the right amount to prevent this?
