  • Hmm. If I’m visualizing this correctly, and depending on the size of the table . . .

    Two pairs of legs with stretchers in between, on pivots that allow them to fold up against the bottom of the table, slightly offset so that the legs end up alongside each other when folded instead of interfering. If you want them to touch the bottom of the tub, set them up to fold at the “knees” rather than the “thighs”, if you see what I mean.

    The difficult part is figuring out how to secure them in the extended position. If you’re okay with putting in a couple of bolts whenever you unfold, you could add a couple of supports that link the stretchers to the underside of the table at an angle (pivoted at the other end again). Or you could attach a length of wood to one stretcher with a pivot and notch the other end so that when the table is unfolded, it drops over the other stretcher and forms a tight cross half-lap joint.

    All of this requires attaching hinges, or bits of wood drilled for dowels, to the bottom of the table with glue or screws to form the pivots.




  • Snapchat is not the only problem here, but it is a problem.

    If they can’t guarantee their recommendations are clean, they shouldn’t be offering recommendations. Even to adults. Let people find accounts to connect with on their own, or through some third party’s curated list.

    If not offering recommendations destroys Snapchat’s business model, so be it. The world will continue on without them.

    It really is that simple.

    Using buggy code (because all nontrivial code is buggy) to offer recommendations only happens because these companies are cheap and lazy. They need to be forced to take responsibility where it’s appropriate. This does not mean that they should be liable for the identity of posters on their network or the content of individual posts—I agree that expecting them to control that is unrealistic—but all curation algorithms are created by them and are completely under their control. They can provide simple sorts based on data visible to all users (there’s a sketch of what I mean at the end of this comment), or leave things to spread externally by word of mouth. Anything beyond that should require human verification, because black-box algorithms demonstrably do not make good choices.

    It’s the same thing as the recent Air Canada chatbot case: the company is responsible for errors made by its software, to about the same extent as it is responsible for errors made by its employees. If a human working for Snapchat had directed “C.O.” to the paedophile’s account, would you consider Snapchat to be liable (for hiring the kind of person who would do that, if nothing else)?
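
    To make the “simple sorts based on data visible to all users” idea concrete, here is a minimal sketch in TypeScript (my own illustration, not anything Snapchat actually runs; every name in it is hypothetical):

    ```typescript
    // Hypothetical summary of an account, built only from data every user can see.
    interface PublicAccount {
      handle: string;
      followerCount: number; // shown on the profile
      lastPostAt: Date;      // visible in the public feed
    }

    // A transparent "recommendation": most recently active first, follower count
    // as the tie-breaker. Any user can reproduce this ordering by hand, so there
    // is no black box to audit.
    function transparentSuggestions(accounts: PublicAccount[]): PublicAccount[] {
      return [...accounts].sort((a, b) => {
        const byRecency = b.lastPostAt.getTime() - a.lastPostAt.getTime();
        return byRecency !== 0 ? byRecency : b.followerCount - a.followerCount;
      });
    }
    ```

    The point isn’t that this particular ordering is good; it’s that anyone can verify why an account appears where it does, which is exactly what a black-box recommender doesn’t allow.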


  • Yes, they should. They chose to deploy the algorithm rather than using a different algorithm, or a human-curated suggestion set, or nothing at all. It’s like a store offering one-per-purchase free bonus items while knowing a few of them are soaked in a contact poison that will make anyone who touches them sick.

    If your business uses a black box to serve clients, you are liable for the output of that black box, and if you can’t find a black box that doesn’t produce noxious output, then either don’t use one or put a human in the loop. Yes, that human will cost you money. That’s why my suggestion at the end was to use a single common feed, to reduce the labour. If they can’t get enough engagement from a single common feed to support the business, maybe the business should be allowed to die.

    The only leg Snapchat has to stand on here is the fact that “C.O.” was violating their TOS by opening an account when she was under the age of 13, and may well have claimed she was over 18 when she was setting up the account.


  • Bunch of things going on here.

    On the one hand, Snapchat shouldn’t be liable for users’ actions.

    On the other hand, Snapchat absolutely should be liable for its recommendation algorithms’ actions.

    On the third hand, the kid presumably lied to Snapchat in order to get an account in the first place.

    On the fourth hand, the kid’s parents failed at basic parenting in ways that have nothing to do with Snapchat. The standard advice apparently never got passed on: “If you get messages on-line that make you uncomfortable or are obviously wrong, show them to a trusted adult—it doesn’t have to be us.” “If you must meet someone you know on-line in person, do it in the most public place you can think of—mall food courts during lunch hour are good. You want to make sure that if you scream, lots of people will hear it.” “Don’t ever get into a car alone with someone you don’t know very well.”

    Solution: make suggestion algorithms opt-in only (if they’re useful, people will opt in). Don’t allow known underage individuals to opt in; if you feel the need to fill that space in the interface, restrict them to a human-curated “general feed” that’s the same for everyone who hasn’t opted in. Get C.O. better parents. (There’s a sketch of the opt-in gate at the end of this comment.)

    None of that will happen, of course.
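
    For what it’s worth, the opt-in gate I’m describing is only a few lines of code. A minimal sketch, assuming the platform has real age verification (all names are hypothetical, not any actual platform’s API):

    ```typescript
    interface User {
      id: string;
      verifiedAdult: boolean;        // assumes real age verification exists
      optedIntoSuggestions: boolean; // explicit, off by default
    }

    interface Post { id: string; author: string; }

    // Human-curated feed, identical for everyone who hasn't (or can't) opted in.
    const GENERAL_FEED: Post[] = [
      { id: "1", author: "editors_pick" },
    ];

    // Stand-in for the black-box recommender; only consenting adults reach it.
    function personalizedSuggestions(userId: string): Post[] {
      return GENERAL_FEED; // placeholder for the actual algorithm's output
    }

    function suggestionsFor(user: User): Post[] {
      // Known minors never reach the algorithm, full stop.
      if (!user.verifiedAdult) return GENERAL_FEED;
      // Adults get the algorithm only if they explicitly asked for it.
      return user.optedIntoSuggestions
        ? personalizedSuggestions(user.id)
        : GENERAL_FEED;
    }
    ```

    The design choice that matters is the default: nobody sees the algorithm’s output unless they are a verified adult who explicitly asked for it.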


  • Actually, what really matters is not the quality of your code or the disruptiveness of your paradigm, or whether you can outlive the competitors that existed when you started up, but whether you can keep the money coming. The rideshares in particular will fail over time in any country with labour laws that allow drivers to unionize—if the drivers make a sane amount of money, the company’s profits plummet, and investors and shareholders head for the hills. Netflix is falling apart already because the corporations with large libraries of content aren’t so happy to license them anymore, and it’s scrambling to make up the revenue it has lost. Google will probably survive only because its real product is the scourge of humanity known as advertising.

    Again, it’s all business considerations, not technical ones. Remember the dot-com boom of the 1990s, or are you not old enough? A lot of what’s going on right now looks like the 2.0 (3.0? 4.0?) release of the same thing. A few of these companies will survive, but more of them will fold, and in some cases their business models will go with them.