You are viewing mulix

Muli Ben-Yehuda's Friends
 

Below are the most recent 25 friends' journal entries.

    Wednesday, April 16th, 2014
    linuxjournalmx 9:12p
    Non-Linux FOSS: Angry IP

    http://feedproxy.google.com/~r/linuxjournalcom/~3/7jgIJoUyJ_g/non-linux-foss-angry-ip

    The de facto standard for port scanning always has been the venerable Nmap program. The command-line tool is indeed very powerful, but I've only ever seen it work with Linux, and every time I use it, I need to read the man page to figure out the command flags. more>>

    googleresearch 2:08p
    Lens Blur in the new Google Camera app

    http://feedproxy.google.com/~r/blogspot/gJZg/~3/-15hoBp1oZY/lens-blur-in-new-google-camera-app.html

    Posted by Carlos Hernández, Software Engineer

    One of the biggest advantages of SLR cameras over camera phones is the ability to achieve shallow depth of field and bokeh effects. Shallow depth of field makes the object of interest "pop" by bringing the foreground into focus and de-emphasizing the background. Achieving this optical effect has traditionally required a big lens and aperture, and therefore hasn’t been possible using the camera on your mobile phone or tablet.

    That all changes with Lens Blur, a new mode in the Google Camera app. It lets you take a photo with a shallow depth of field using just your Android phone or tablet. Unlike a regular photo, Lens Blur lets you change the point or level of focus after the photo is taken. You can choose to make any object come into focus simply by tapping on it in the image. By changing the depth-of-field slider, you can simulate different aperture sizes, to achieve bokeh effects ranging from subtle to surreal (e.g., tilt-shift). The new image is rendered instantly, allowing you to see your changes in real time.

    Lens Blur replaces the need for a large optical system with algorithms that simulate a larger lens and aperture. Instead of capturing a single photo, you move the camera in an upward sweep to capture a whole series of frames. From these photos, Lens Blur uses computer vision algorithms to create a 3D model of the world, estimating the depth (distance) to every point in the scene. Here’s an example -- on the left is a raw input photo, in the middle is a “depth map” where darker things are close and lighter things are far away, and on the right is the result blurred by distance:

    Here’s how we do it. First, we pick out visual features in the scene and track them over time, across the series of images. Using computer vision algorithms known as Structure-from-Motion (SfM) and bundle adjustment, we compute the camera’s 3D position and orientation and the 3D positions of all those image features throughout the series.

    Once we’ve got the 3D pose of each photo, we compute the depth of each pixel in the reference photo using Multi-View Stereo (MVS) algorithms. MVS works the way human stereo vision does: given the location of the same object in two different images, we can triangulate the 3D position of the object and compute the distance to it. How do we figure out which pixel in one image corresponds to a pixel in another image? MVS measures how similar they are -- on mobile devices, one particularly simple and efficient way is computing the Sum of Absolute Differences (SAD) of the RGB colors of the two pixels.
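    As a rough sketch of that similarity measure (plain Python with made-up pixel values; an illustration, not the app's actual code), SAD over two RGB pixels and a toy best-match search along a row of candidate pixels might look like:

```python
def sad(p, q):
    """Sum of Absolute Differences between two RGB pixels given as (r, g, b)."""
    return sum(abs(a - b) for a, b in zip(p, q))

def best_match(pixel, candidates):
    """Index of the candidate whose color is most similar to `pixel`."""
    return min(range(len(candidates)), key=lambda i: sad(pixel, candidates[i]))

# A pixel from one image, and a row of candidate pixels from another view:
left_pixel = (200, 120, 40)
right_scanline = [(10, 10, 10), (198, 118, 42), (255, 255, 255)]
print(sad(left_pixel, right_scanline[1]))      # 6
print(best_match(left_pixel, right_scanline))  # 1
```

    The lower the SAD, the more similar the pixels, which is what makes it cheap enough to run on a phone.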

    Now it’s an optimization problem: we try to build a depth map where all the corresponding pixels are most similar to each other. But that’s typically not a well-posed optimization problem -- you can get the same similarity score for different depth maps. To address this ambiguity, the optimization also incorporates assumptions about the 3D geometry of a scene, called a "prior," that favors reasonable solutions. For example, you can often assume two pixels near each other are at a similar depth. Finally, we use Markov Random Field inference methods to solve the optimization problem.
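    The production system uses real MRF inference; purely as an illustration of the idea, here is a toy one-dimensional version, where the data term is a per-pixel matching cost, the prior penalizes depth jumps between neighbors, and dynamic programming finds the exact minimum along a single scanline (labels and costs are made up):

```python
def scanline_depth(cost, smooth=1.0):
    """cost[i][d]: matching cost of giving pixel i the depth label d.
    Minimizes sum_i cost[i][d_i] + smooth * sum_i |d_i - d_{i+1}|
    exactly along one scanline, via dynamic programming."""
    n, k = len(cost), len(cost[0])
    dp = list(cost[0])   # dp[d]: best energy of the prefix ending in label d
    back = []            # back-pointers for recovering the labeling
    for i in range(1, n):
        ptr, new = [0] * k, [0.0] * k
        for d in range(k):
            prev = min(range(k), key=lambda e: dp[e] + smooth * abs(d - e))
            ptr[d] = prev
            new[d] = cost[i][d] + dp[prev] + smooth * abs(d - prev)
        dp = new
        back.append(ptr)
    d = min(range(k), key=lambda e: dp[e])
    labels = [d]
    for ptr in reversed(back):
        d = ptr[d]
        labels.append(d)
    return labels[::-1]

# A noisy pixel (index 2 prefers label 1) gets smoothed away by a strong prior:
cost = [[0, 5], [0, 5], [4, 0], [0, 5]]
print(scanline_depth(cost, smooth=3.0))  # [0, 0, 0, 0]
print(scanline_depth(cost, smooth=0.0))  # [0, 0, 1, 0]
```

    With the prior turned off, each pixel just takes its cheapest label; with it on, the ambiguous pixel is pulled toward its neighbors, which is exactly the effect the prior is meant to have on the depth map.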

    Having computed the depth map, we can re-render the photo, blurring pixels by differing amounts depending on the pixel’s depth, aperture and location relative to the focal plane. The focal plane determines which pixels to blur, with the amount of blur increasing proportionally with the distance of each pixel to that focal plane. This is all achieved by simulating a physical lens using the thin lens approximation.
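    A minimal sketch of that final step, using the standard thin-lens circle-of-confusion formula (variable names and numbers are mine, not the app's):

```python
def coc_diameter(depth, focus_depth, focal_length, aperture):
    """Circle-of-confusion diameter for a point at `depth`, with the lens
    focused at `focus_depth` (all in the same units, e.g. millimeters),
    under the thin-lens approximation. Larger values mean more blur."""
    return (aperture * abs(depth - focus_depth) / depth
            * focal_length / (focus_depth - focal_length))

# A point on the focal plane renders perfectly sharp:
print(coc_diameter(2000, 2000, 50, 25))  # 0.0
# Blur grows with distance from the focal plane:
print(coc_diameter(4000, 2000, 50, 25) > coc_diameter(3000, 2000, 50, 25))  # True
```

    Moving the depth-of-field slider in the app corresponds to changing the `aperture` term here, and tapping to refocus corresponds to changing `focus_depth`.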

    The algorithms used to create the 3D photo run entirely on the mobile device, and are closely related to the computer vision algorithms used in 3D mapping features like Google Maps Photo Tours and Google Earth. We hope you have fun with your bokeh experiments!
    michaelswanwick 7:22a
    Make Your Darlings Suffer

    http://floggingbabel.blogspot.com/2014/04/make-your-darlings-suffer.html

    It's too early to know exactly what influence George R. R. Martin's A Song of Ice and Fire will have on the fantasy genre, though it's a safe bet that it will be significant.  But it's pretty obvious that the Red Wedding scene all by itself will have a significant impact -- and a good one, too.

    One of the signature weaknesses of a new writer is a tendency to be too nice to one's characters.  Some weaknesses, such as a propensity to spend five to ten pages of a story "setting the scene" before finally getting around to the plot, can be cured simply by explaining clearly why they're a bad idea and how they can be circumvented.  But when all of one's upbringing has been devoted to turning one into a decent person, niceness can be hard to undo.  "Look," I'll say to my students on those occasions when I teach, "it would be a heinous act to throw a woman into the path of an oncoming train.  But we celebrate Tolstoy for doing so in Anna Karenina.  These are not real people we're dealing with here.  They're only words on paper.  Make those bastards suffer!"

    They hear but, half in love with their own creations, they do not easily believe.

    There's a lot to admire about the Red Wedding, including the fact that it took readers and, later, viewers by surprise.  I'm sure there are many new writers out there at this very moment feverishly plotting out their own massacres in imitation.  And that's good, because while most of those bloodlettings are destined for the drawer, they're a positive step toward publication.  Many more writers are taking to heart George's exemplary willingness to kill off characters who've won the readers' affections.  That's also good.  But the chief lesson to be learned here is to let your darlings suffer.

    Why is this desirable?  Because there are things we must learn in life which can only be learned through suffering.  If that suffering is experienced only in our imaginations, so much the better.

    Also, it can be wonderfully entertaining.

    The opening of the Honest Trailers spoof of Game of Thrones begins "From fiction's most notorious serial killer..."  But let's be honest here.  It should be "From fiction's most beloved serial killer..."  I trust that any new writers reading this are taking the implicit moral to heart.


    *
    xkcd_rss 4:00a
    philg 2:42a
    Passover Tax Day thoughts

    http://blogs.law.harvard.edu/philg/2014/04/15/passover-tax-day-thoughts/

    http://blogs.law.harvard.edu/philg/?p=5836

    Why is this Passover different from most others? The first day coincides with Tax Day for Americans. Jews remember the bitter hardship and forced labor of slavery in Egypt. Scholars, however, can find no evidence of modern-era-style slavery (or of Jews residing in or escaping from Egypt). As for the pyramids, there are records of payments to laborers, typically farmers who had nothing else to do at certain times of year. A “slave” in ancient Egypt may simply have been a person subject to a 20% tax that free Egyptians did not pay.

    [To the children at last night's Seder I pointed out that "the plagues visited upon the Egyptians were so bad that their civilization lasted only about 2000 more years."]

    Tuesday, April 15th, 2014
    linuxjournalmx 7:28p
    Encrypting Your Cat Photos

    http://feedproxy.google.com/~r/linuxjournalcom/~3/JEuv4Mr1_5E/encrypting-your-cat-photos

    The truth is, I really don't have anything on my hard drive that I would be upset over someone seeing. I have some cat photos. I have a few text files with ideas for future books and/or short stories, and a couple half-written starts to NaNoWriMo novels. It would be easy to say that there's no point encrypting my hard drive, because I have nothing to hide. more>>

    bunnyhuangblog 7:03p
    Myriad RF for Novena

    http://www.bunniestudios.com/blog/?p=3727

    This is so cool. Myriad-RF has created a port of their wideband software defined radio to Novena (read more at their blog). Currently, it’s just CAD files, but if there’s enough interest in SDR on Novena, they may do a production run.

    The board above is based on the Myriad-RF 1. It is a fully configurable RF board that covers all commonly used communication frequencies, including LTE, CDMA, TD-CDMA, W-CDMA, WiMAX, 2G and many more. Their Novena variant plugs right into our existing high speed expansion slot — through pure coincidence both projects chose the same physical connector format, so they had to move a few traces and add a few components to make their reference design fully inter-operable with our Novena design. Their design (and the docs for the transceiver IC) is also fully open source, and in fact they’ve one-upped us because they use an open tool (KiCad) to design their boards.

    I can’t tell you how excited I am to see this. One of our major goals in doing a crowdfunding campaign around Novena is to raise community awareness of the platform and to grow the i.MX6 ecosystem. We can’t do everything we want to do with the platform by ourselves, and we need the help of other talented developers, like those at Myriad-RF, to unlock the full potential of Novena.

    feld_feed 5:23p
    Today’s Fun – Gnip, Twitter, Uncommon Stock, and Pre-Seed Rounds

    http://feedproxy.google.com/~r/FeldThoughts/~3/xLXygqvS7gY/todays-fun-gnip-twitter-uncommon-stock-pre-seed-rounds.html

    http://www.feld.com/wp/?p=10130

    FSA (Feld Service Announcement) – my version of a “public service announcement”: Moz is on the hunt for a VP of UX and Design. This role is one of our most crucial hires this year. The ideal candidate will come to us with experience and examples to show of very complex, technical projects that s/he made simple and fun. I would love for you to share this job description with your network or if you have anyone in mind I would love for you to send them our way.

    Yeah, it’s been kind of busy the last week. Congrats to my friends at Gnip on becoming part of the Twitter flock. I have a great origin story about the founding of Gnip and the first few years for some point in the future. But for now, I’m just going to say to everyone involved “y’all are awesome.”

    Last week Manu Kumar had a spectacular post titled The New Venture Landscape. While it’s Bay Area-centric, I especially agree with the punch line:

    Pre-Seed is the new Seed. (~$500K used for building team and initial product/prototype)
    Seed is the new Series A. (~$2M used for building product, establishing product-market fit and early revenue)
    Series A is the new Series B. (~$6M-$15M used to scale customer acquisition and revenue)
    Series B is the new Series C.
    Series C/D is the new Mezzanine.

    Today at 5pm I’m doing a fireside chat with Eliot Peper, the author of Uncommon Stock, the first book published by FG Press. Join us for some virtual fun and a discussion about fiction, books, and startups.

    And – if you miss that, Eliot is doing another event on Friday at 5pm at Spark Boulder.


    yodaikenfsmfeed 11:05a
    Economics of Free Software

    http://vyodaiken.com/2014/04/15/economics-of-free-software/

    http://vyodaiken.com/?p=1678

    Fate has made me the “money guy” for OpenSSL so I’m going to talk about that for a bit.

    As has been well reported in the news of late, the OpenSSL Software Foundation (OSF) is a legal entity created to hustle money in support of OpenSSL. By “hustle” I mean exactly that: raising revenue by any and all means[1]. OSF typically receives about US$2000 a year in outright donations and sells commercial software support contracts[2] and does both hourly rate and fixed price “work-for-hire” consulting as shown on the OSF web site. The media have noted that in the five years since it was created OSF has never taken in over $1 million in gross revenues annually.

    Thanks to that publicity there has been an outpouring of grassroots support from the OpenSSL user community, roughly two hundred donations this past week[3] along with many messages of support and encouragement[4]. Most were for $5 or $10 and, judging from the E-mail addresses and names, were from all around the world. I haven’t finished entering all of them to get an exact total, but all those donations together come to about US$9,000.

    OpenSSL uses a “give away code and charge for consulting” model that FSMLabs began with in 1999. We couldn’t make it work either.

     

    Monday, April 14th, 2014
    daniellemirefd 11:16p
    Science: ideals vs. reality

    http://feedproxy.google.com/~r/daniel-lemire/atom/~3/UNnSalI0xbQ/

    http://lemire.me/blog/?p=6294

    Fernando Pérez gave a talk at Pycon 2014 with a brilliant slide:

    The ideals → reality of science:

    • The pursuit of verifiable answers → highly cited papers for your c.v.
    • The validation of our results by reproduction → convincing referees who did not see your code or data
    • An altruistic, collective enterprise → a race to outrun your colleagues in front of the giant bear of grant funding

    Credit: Bill Tozier for the pointer.

    daniellemirefd 5:34p
    The financial value of open source software

    http://feedproxy.google.com/~r/daniel-lemire/atom/~3/uRJoRJxFO70/

    http://lemire.me/blog/?p=6254

    We all rely daily on free and open source software, whether we know it or not. The entire Internet is held together by open source software. The cheap router that powers your Wi-Fi network at home uses the Linux kernel. Your Android phone is based on the Linux kernel. Google’s servers run Linux. In 2014, almost everyone is a Linux user.

    For most people, the financial value of this software is an abstract concept. I think that most people assume that open source software must be cheap.

    On the contrary, producing quality open source software is tremendously expensive. And the financial investment grows every year.

    How much did it cost to write the millions of lines of the Linux kernel? García-García and de Magdaleno estimated the cost of the Linux kernel, as of 2010, at 1.2 billion euros. That is how much it would cost any one company to redo the Linux kernel from scratch.
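    Estimates like this come from COCOMO-style effort models. As a hedged sketch of the arithmetic (the constants below are the classic basic-COCOMO values, and both the line count and the 8,000-euro person-month cost are illustrative assumptions of mine, not the study's actual parameters):

```python
def cocomo_effort_pm(kloc, a=2.4, b=1.05):
    """Basic-COCOMO effort estimate in person-months. The defaults are the
    classic 'organic' constants; a=3.6, b=1.20 are the 'embedded' ones."""
    return a * kloc ** b

kernel_kloc = 12_000  # roughly 12 million lines circa 2010 (assumption)
effort_pm = cocomo_effort_pm(kernel_kloc, a=3.6, b=1.20)
cost_eur = effort_pm * 8_000  # assumed fully loaded cost per person-month
print(round(cost_eur / 1e9, 1), "billion euros")  # same order as the study's figure
```

    The exact output depends entirely on the chosen constants; the point is that any plausible parameterization of a 12-million-line project lands in the billions.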

    You might assume that programmers working on the Linux kernel are hopeless nerds who live in their parents’ basements. In fact, most of them are highly qualified engineers earning six-figure salaries or better. So the financial estimate represents real money. It is not a virtual cost.

    Of course, the Linux kernel is a tiny fraction of all the open source software we rely upon. Most open source developers will never contribute to the Linux kernel: it is reserved for a small elite. According to the Linux Foundation, the cost of building a standard Linux distribution (in 2008) would have been over $10 billion.

    So what is the value of all open source software beyond Linux?

    It helps to realize that software is a huge business. In Europe, companies and governments spend over 200 billion euros a year building software. To put this in perspective, the movie industry in the US generates about 10 billion dollars in revenues. In the United States, 1 out of every 200 workers is a software engineer. A very sizeable fraction of all “engineering” today is in software.

    Of course, not all of this software is open source. Still, Daffara estimates the financial value of open source software, for Europe alone, at over 100 billion euros a year.

    So why don’t we have more open source drug designs, movie content, textbooks, and so on?

    The common argument is that nobody will be willing to invest, in say, a new textbook, a new drug or a new show if anyone can copy and redistribute it for free—the investment is too large.

    But I think that the real difference is cultural. In the software world, entire businesses grew surrounded by open source software. They learned to thrive with and through open source software. Companies that entirely reject open source are at a competitive disadvantage. The same happened in the fashion industry. Designers assume that other people will copy them. In fact, designers hope others will copy them.

    Other industries, like the pharmaceutical and education industries, have internalized the patent and copyright systems. That is why college students have to pay over $100 for a typical textbook, whereas they can get for free an operating system that cost billions to make.

    I think that if we had a world where it was fair game to copy and distribute a textbook for free, we would still have textbooks. I think they would still be excellent. I also think that textbook authors would get paid, just as programmers do.

    Would the overall result be better? I do not know but it is fascinating to imagine what such a parallel universe might look like.

    Credit: Thanks to Christopher Smith for useful pointers.

    kuro5hin 3:19p
    URGENT HELP NEEDED

    http://www.kuro5hin.org/story/2014/4/12/151527/506

    Hello, I am Public Barr. Neil Cane, an attorney of law to a deceased Process Architect, who was based in Salmon Creek, Washington, also referred to as my client. On the 29th of March 2014, my client was arrested by corrupt Government Officials and charged with malicious mischief. He was killed in prison under suspicious circumstances and died due to rectal bleeding following sexual impalement.
    michaelswanwick 5:56a
    Radiant Doors . . . the Series?

    http://floggingbabel.blogspot.com/2014/04/radiant-doors-series.html

    It's too early to break out the champagne yet, but the cable network WGN America has given what's called "a script order against a series commitment" to a television series based on my story "Radiant Doors."

    What this means is that if the network likes the script (now being written by Jeremy Doner), the series will be made.  Justin Lin, the director of Fast and Furious 6, will be the director and executive producer if and when Radiant Doors is made.

    "Radiant Doors" is the single darkest story I've ever written -- and that's saying something.  The premise is that one day radiant doors open in the air everywhere in the world and through them pour millions of refugees.  They've all been terribly abused.  And they're from our future.

    I don't know anything about Justin Lin's vision for the series, and that's probably just as well.  Neither he nor Doner needs me peering over their shoulders, second-guessing them.  But in addition to the obvious benefits to me if the series is ever made, I'd love to see just what they do with the premise.

    You can read all about it here.


    Above:  Justin Lin

    *
    xkcd_rss 4:00a
    computcomplexit 2:40a
    Factorization in coNP- in other domains?

    http://blog.computationalcomplexity.org/2014/04/factorization-in-conp-in-other-domains.html

    On an exam in my grad complexity course, I asked the students to show that the following set is in coNP:

    FACT = { (n,m) : there is a factor y of n with 2 \le y \le m }

    The answer I was looking for was to write FACTbar (the complement) as

    FACTbar = { (n,m) | (\exists p_1,...,p_L) where L \le log n
    for all i \le L we have m < p_i \le n and p_i is prime (the p_i are not necc distinct)
    n =p_1 p_2 ... p_L
    }
    INTUITION: Find the unique factorization and note that none of the primes are \le m.
    To prove this works you seem to need the Unique Factorization theorem, and you need
    that PRIMES is in NP (the fact that it is in P does not help).
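    A hedged sketch of what a verifier for that certificate might look like (plain Python; trial division stands in for an NP certificate of primality, and the function names are mine, not from the exam solution):

```python
def is_prime(p):
    """Trial-division primality test (a stand-in for a primality certificate)."""
    if p < 2:
        return False
    d = 2
    while d * d <= p:
        if p % d == 0:
            return False
        d += 1
    return True

def verify_factbar_certificate(n, m, primes):
    """Accept iff `primes` witnesses that n has NO factor y with 2 <= y <= m,
    i.e. every p_i satisfies m < p_i <= n, is prime, and their product is n."""
    if len(primes) > n.bit_length():  # L <= log n
        return False
    product = 1
    for p in primes:
        if not (m < p <= n and is_prime(p)):
            return False
        product *= p
    return product == n

# 77 = 7 * 11 has no factor <= 5, so a valid certificate exists:
print(verify_factbar_certificate(77, 5, [7, 11]))   # True
# but 77 does have a factor <= 10 (namely 7), so this certificate is rejected:
print(verify_factbar_certificate(77, 10, [7, 11]))  # False
```

    The verifier runs in time polynomial in the length of (n, m), which is what puts the complement in NP and hence FACT in coNP.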

    A student who I will call Jesse (since that is his name) didn't think to complement the set, so instead he wrote the following CORRECT answer:

    FACT = { (n,m) | n is NOT PRIME and forall p_1,p_2,...,p_L  where 2\le L\le log n
    for all i \le L,  m< p_i \le n-1 , (p_i prime but not necc distinct).
    n \ne p_1 p_2 ... p_L
    }
    (I doubt this proof that FACT is in coNP is new.)
    INTUITION: show that all possible ways to multiply together numbers larger than m do not yield n,
    hence n must have a factor \le m.
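    Jesse's condition can be checked by brute force on small n (exponential time, and purely illustrative; the function name is mine):

```python
def is_prime(p):
    """Trial-division primality test."""
    if p < 2:
        return False
    d = 2
    while d * d <= p:
        if p % d == 0:
            return False
        d += 1
    return True

def jesse_fact(n, m):
    """Brute-force check of Jesse's condition: n is composite and NO product
    of primes p with m < p <= n-1 equals n. If that holds, n must have a
    factor <= m, so (n, m) is in FACT. Small n only."""
    if n < 4 or is_prime(n):
        return False
    primes = [p for p in range(max(2, m + 1), n) if is_prime(p)]
    def can_build(target, idx):  # can `target` be written as a product of primes[idx:]?
        if target == 1:
            return True
        return any(target % p == 0 and can_build(target // p, i)
                   for i, p in enumerate(primes[idx:], idx))
    return not can_build(n, 0)

print(jesse_fact(15, 3))  # True: no product of primes in (3, 14] equals 15, so 3 | 15
print(jesse_fact(35, 4))  # False: 35 = 5 * 7 with both primes > 4
```

    The interesting point the post makes is visible here: nothing in this check appeals to uniqueness of factorization, only to the nonexistence of any all-large factorization.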

    Here is what strikes me: Jesse's proof does not seem to use Unique Factorization. Hence it can be used in other domains(?), even those that do not have Unique Factorization (e.g., Z[\sqrt{-5}]). Let D = Z[\alpha_1,...,\alpha_k] where the \alpha_i are algebraic. If n \in D then let N(n) be the absolute value of the sum of the coefficients (we might want to use the product of n with all of its conjugates instead, but let's not for now).

    FACT = { (n,m) : n\in D, m\in NATURALS, there is a factor y in D of n with 2 \le N(y) \le m}

    Is this in NP? Not obvious (to me) -- how many such y's are there?

    Is this the set we care about? That is, if we knew this set is in P would factoring be in P? Not obv (to me).

    I suspect FACT is in NP, though perhaps with a different definition of N( ). What about FACTbar?
    I think Jesse's approach works there, though it might need a different bound than log n.

    I am (CLEARLY) not an expert here and I suspect a lot of this is known, so my real point is
    that a student's answer different from the one you had in mind can be inspiring. And in fact I am inspired to
    read Factorization: Unique and Otherwise by Weintraub, which is one of many books I've been
    meaning to read for a while.
    philg 12:41a
    Hugo Chavez’s legacy

    http://blogs.law.harvard.edu/philg/2014/04/13/hugo-chavezs-legacy/

    http://blogs.law.harvard.edu/philg/?p=5834

    I’m reading Comandante: Hugo Chávez’s Venezuela. The book contains an interesting perspective on this leader, from an executive at the Venezuelan national oil company:

    Sansó defended Chávez’s energy policy, saying the comandante had helped revive OPEC, sending prices rising even before the Iraq war, and had had the vision to recognize Venezuela’s oil was not just around Lake Maracaibo, in the west, but also in the center of the country along the Orinoco in a smiling arc known as the Faja. The same wilderness that had swallowed gold-seeking conquistadores contained enormous deposits of extra-heavy crude. The black ooze had long been written off as tar, a costly-to-extract type of liquid coal, and the old PDVSA gave foreign oil companies a virtual free hand to develop it. Chávez insisted it was oil, and eventually even the U.S. Geological Survey agreed. The zone contained an estimated 220 billion barrels—making Venezuela’s total reserves vaster than Saudi Arabia’s. Chávez partly nationalized the Faja in 2007, taking majority shares in the operations, an audacious decision that infuriated the foreign oil companies working there. “For that alone Chávez was worth it,” said Sansó. “He was crazy enough to do it. Any reasonable guy wouldn’t have had the guts. He would have said it’s not possible. A century from now Chávez will be remembered and thanked for this, no matter what else happens.”

    The book could use some editing. It jumps back and forth in time. There is some redundancy. But it is highly relevant right now when politicians and newspapers worldwide are trying to get people excited about “income inequality” (e.g., see this New York Times op-ed from yesterday). Chávez did not just talk about income inequality but took action.

    Sunday, April 13th, 2014
    rands_in_repose 7:35p
    Protecting Yourself from Heartbleed

    http://mashable.com/2014/04/09/heartbleed-bug-websites-affected/?utm_cid=mash-com-fb-main-link

    http://randsinrepose.com/?p=1470

    Earlier this morning, I tweeted:

    This is not actually good advice. You shouldn’t be changing your password on a server until the server administrator has confirmed whether their servers were affected and, if so, whether the server has been patched.

    Mashable has an up-to-date breakdown of the most popular services out there and their disposition relative to Heartbleed.

    #

    rmlove 4:01p
    The End-of-Life of Windows XP and SSL/TLS Configurations

    http://feeds.rlove.org/~r/rlove/~3/6AsgFBc3c9A/the-end-of-life-of-windows-xp-and.html

    This is a followup to my previous post, Strong SSL/TLS Cryptography in Apache and Nginx.

    Perhaps hard to tell given how many users remain, but Windows XP reached its end of life on 8 April 2014. This means no more support, updates, or bug fixes—not even of critical security flaws. Windows XP use has been dwindling, but its end-of-life provides an excellent opportunity to consider removing support for it from your applications and websites.

    Dropping Windows XP support provides particularly interesting results for SSL/TLS configurations, as most of the compromises one makes in their provided cipher suites are in support of old versions of Internet Explorer on Windows XP. Since those users are now even more of a walking botnet and malware infestation, we needn't continue to support them to the detriment of the rest of the Internet.

    And what changes can we make? In my previous cryptography guide, I advocate disabling SSLv3 support, which breaks Internet Explorer 6 on Windows XP, but prevents a downgrade attack for everyone else. If we're willing to drop support for all versions of Internet Explorer on Windows XP, we can accomplish two other goals:

    • Only support Perfect Forward Secrecy, offering no cipher suites without forward security.
    • Only support modern ciphers. Currently this just means AES (in both CBC and GCM mode) but in the future will include ChaCha20+Poly1305.

    To make these changes, follow my previous guide but use this cipher suite ordering for Apache:

    SSLCipherSuite ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA:ECDHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES128-SHA256:DHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA:AES256-GCM-SHA384:AES128-GCM-SHA256:AES256-SHA256:AES128-SHA256:AES256-SHA:AES128-SHA:!aNULL:!eNULL:!EXPORT:!CAMELLIA:!3DES:!DES:!MD5:!PSK:!RC4:!RSA

    SSLHonorCipherOrder on

    And this cipher suite ordering for Nginx:

    ssl_ciphers 'ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA256:ECDHE-RSA-AES256-SHA:ECDHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES128-SHA256:DHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA:AES256-GCM-SHA384:AES128-GCM-SHA256:AES256-SHA256:AES128-SHA256:AES256-SHA:AES128-SHA:!aNULL:!eNULL:!EXPORT:!CAMELLIA:!3DES:!DES:!MD5:!PSK:!RC4:!RSA';

    ssl_prefer_server_ciphers on;
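    Before deploying, you can sanity-check how such a cipher string expands against your local OpenSSL using Python's ssl module (a hedged sketch; I've shortened the string to the GCM suites here, and the exact expansion depends on your OpenSSL build -- on OpenSSL 1.1.1+ the TLS 1.3 suites are always enabled in addition):

```python
import ssl

# Expand a shortened, GCM-only version of the cipher string with the local
# OpenSSL and list every cipher that survives the exclusions.
CIPHERS = (
    "ECDHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES128-GCM-SHA256:"
    "DHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:"
    "!aNULL:!eNULL:!EXPORT:!3DES:!DES:!MD5:!PSK:!RC4"
)

ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.set_ciphers(CIPHERS)
names = [c["name"] for c in ctx.get_ciphers()]
print("\n".join(names))
```

    `set_ciphers` raises `ssl.SSLError` if the string leaves no usable cipher, so this also catches typos in the suite names before they reach your server config.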

    With the current version of OpenSSL, this yields the following ciphers, in descending order of preference:

    TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 (0xc030)
    TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 (0xc02f)
    TLS_DHE_RSA_WITH_AES_256_GCM_SHA384 (0x9f)
    TLS_DHE_RSA_WITH_AES_128_GCM_SHA256 (0x9e)
    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 (0xc028)
    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 (0xc027)
    TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA (0xc014)
    TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA (0xc013)
    TLS_DHE_RSA_WITH_AES_256_CBC_SHA256 (0x6b)
    TLS_DHE_RSA_WITH_AES_128_CBC_SHA256 (0x67)
    TLS_DHE_RSA_WITH_AES_256_CBC_SHA (0x39)
    TLS_DHE_RSA_WITH_AES_128_CBC_SHA (0x33)

    This is a small, focused list, with absolutely no compromises for security, obeying the following rules:

    • Only support PFS. We favor ECDHE over DHE as the former is less resource intensive, but we support both.
    • Only support modern ciphers, which currently is just AES-CBC and AES-GCM. We favor GCM mode over CBC mode as the former is more efficient and not susceptible to the BEAST attack.
    • Favor 256-bit key size over 128 but support nothing smaller.
    • Support SHA-2 and SHA, nothing else. Prefer SHA-2 over SHA. For SHA-2, prefer 384-bit digests over 256-bit.

    With this cipher suite ordering, Chrome and Firefox will both use TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256—a mighty fine choice—but even our least-favored cipher, TLS_DHE_RSA_WITH_AES_128_CBC_SHA, provides forward security and a strong cipher.

    For all your hard effort, this will earn you an "A+" grade and near-perfect SSL Labs Rating:

    SSL Labs A+ Grade for rlove.org

    As before, you cannot do better without silly compromises, such as only supporting TLS 1.2, which would earn you a 100 in "Protocol Support," but then only Chrome and Firefox 27 could access your site.

    Dropping every version of Internet Explorer on Windows XP likely just means the addition of IE 7 and 8 to the already-broken IE 6.

    Indeed, I'm not thrilled to recommend only one cipher. Even if AES were perfect, we ought to have choice. I believe ChaCha20+Poly1305 is an excellent alternative. It is currently supported by Chrome but is not yet in OpenSSL. Once it lands there, I will update my recommendations.

    isra12998 9:00p
    Michael Schumacher in Days of Old

    http://israblog.nana10.co.il/blogread.asp?blog=12998&blogcode=14087282

    And in those far-off days, our friend Michael was young, and the locks of his hair were long. A lad he was, and he sported with his companions.
    And in those distant days, Michael was counted among the band of companions of a certain captain of hosts, who taught the youths the wars of Gog and Magog, of Goliath the Philistine, and the mighty deeds of Samson son of Manoah. A diligent student was Michael, and he pursued his studies of war with his band of companions.
    Quiet, humble, and lowly of spirit was Michael in all the deeds of war in which he took his part. Quiet he was in his warfare.

    But when a ghostly spirit came down from the mountains, the storm of the raging lion would burst forth from Michael's quiet and humble soul; fire and brimstone flashed from his eyes, and all his enemies were smitten with a mighty hand, with an outstretched arm, and with sling-stones hurled with vigor from Michael's sinewy arms as he raged upon the roads. And in those stormy days, only the captain of hosts could speak to Michael's heart and return the ghostly spirit into his humble soul.

    And it came to pass one day that a herald arrived from afar, and thus spoke the herald: the strongest of the band of companions, blessed in his deeds, shall go forth to distant lands in the iron bird, farther even than Jonah's journey in the belly of the fish, and shall reach the land of the white Cuthites with the white locks. There, in that far country, the lads shall play before the nobles, and the elders shall judge them with righteous judgment.
    Sunday, April 6th, 2014
    rands_in_repose 4:55p
    If You Happen to Be Building a New Operating System…

    http://www.theverge.com/2014/4/5/5585216/team-behind-webos-releases-mochi-redesign-open-source

    http://randsinrepose.com/?p=1441

    You could do a lot worse than the design for webOS. The team recently released a wiki full of documents, design assets, and working samples to the open source community.

    Gorgeous examples of flat design.

    #

    rands_in_repose 7:02p
    Presentation Design Joy

    http://randsinrepose.com/links/2014/04/06/presentation-design-joy/

    http://randsinrepose.com/?p=1446

    As appears to be tradition now with the iWork suite of applications, Apple is slowly updating the applications both to address minor issues and to reintroduce functionality that was removed in the most recent major update.

    As is now custom, I keep the old version of Keynote around to compare and contrast feature sets, because while Apple’s “What’s New in Keynote” is useful, it often neglects to mention interesting changes to functionality and design.

    The headline is: nothing earth shattering has landed in Keynote 6.2 that is going to affect my presentation design workflow. To determine this, I compared toolbars, preferences, inspectors, and menu bars between Keynote 6.1 and 6.2. It’s not an exhaustive comparison, but this is where I tend to spend my time and any improvement has potential to increase my presentation design joy.

    So, yes, the toolbar is updated. Keynote 6.1’s toolbar is on the top, Keynote 6.2’s is on the bottom – click to see a larger version:

    keynote toolbars

    This adds a button I don’t need – add a slide – because I’m a keyboard guy and Cmd-Shift-N works great. They’ve also changed the Setup “inspector” to Document, which makes sense in my head. These palettes remain frustratingly docked in the main window. As I’ve written about before, I’m uncertain whether this is a usability improvement, but I’m about to enter a presentation-heavy lifestyle over the next three months, so I’ll have a better sense of the use of these embedded palettes.

    Preferences were mostly unchanged. They added the ability to show slide layout names, which I have not figured out. I can display ruler units as a percentage. Ok. Great?

    Animations received love with the addition of new transitions and builds. They also added motion blur to the animations, which is a slick visual flourish you’ll never actually see, but will appreciate. Magic Move adds text morphing, which means it will continue to be one of my go-to animations, as my presentations tend to be text focused1. Magic Move is still baffling to set up and remains fragile as it relies on multiple slides to be… just right, but I’m happy to see its evolution.

    Presentation view, I believe, remains functionally equivalent to the prior version, but did receive design love.

    keynote-design-love

    In both practice and play mode2, the presentation view now shows you when you’re ready to proceed with a clear green bar across the top of the view. When an animation is running, this bar is red, which is handy. All of the buttons at the top of the window have been increased in size, altered in color, and better placed to make your presentation practicing easier. Lastly, and most importantly, while you still cannot perform free-form layout of the presentation view, Keynote does allow you to change the style of the presentation notes on a per-slide basis. I’m not sure when this handy feature landed; it wasn’t in Keynote 6.2. You still can’t change the presentation notes style at the master slide level, which would be a convenient and efficient way to make sure that presentation notes are optimally sized in the presenter view.

    According to the What’s New update provided by Apple, there are many other new features: alpha image editing, media browser improvements, custom data formats, improved AppleScript support, support for animated GIFs (yay?), and others. Again, nothing earth-shattering; it’s a housecleaning release, and it’s going to take a few weeks of regular use to see if they’ve increased my presentation design joy.


    1. EDIT: I originally thought this was the returning of the same old text transforms. I was wrong
    2. EDIT: There was some stop/advance visual cue work in Keynote 6.1 in real presentation mode, but it appears they fleshed it out and finished it in 6.2. 


    Saturday, April 12th, 2014
    liwblogfd 2:25p
    Programming performance art

    http://blog.liw.fi/posts/hacking-performance/

    Thirty years ago I started to learn programming. To celebrate this, I'm doing a bit of programming as a sort of performance art. I will write a new program, from scratch, until it is ready for me to start using it for real. The program won't be finished, but it will be ready for my own production use. It'll be something I have wanted to have for a while, but I'm not saying beforehand what it will be. For me, the end result is interesting; for you, the interesting part is watching me be stupid and make funny mistakes.

    The performance starts Friday, 18 April 2014, at 09:00 UTC. I apologise if this is an awkward time for you. No time is good for everyone, so I picked a time that is good for me.

    Run the following command to see what the local time will be for you.

    date --date '2014-04-18 09:00:00 UTC'
    

    While I write this program, I will broadcast my terminal to the Internet for anyone to see. For instructions, see the http://liw.fi/distix/performance-art/ page.

    There will be an IRC channel as well: #distix on the OFTC network (irc.oftc.net). Feel free to join there if you want to provide real time feedback (the laugh track).

    Friday, April 11th, 2014
    daniellemirefd 7:48p
    Probabilities and the C++ standard

    http://feedproxy.google.com/~r/daniel-lemire/atom/~3/cV93Vdie2vY/

    http://lemire.me/blog/?p=6266

    The new C++ standard introduced hash functions and hash tables in the language (as “unordered maps”).

    As every good programmer should know, hash tables only work well if collisions between keys are rare. That is, if you have two distinct keys k1 and k2, you want their hash values h(k1) and h(k2) to differ most of the time.

    The C++ standard does not tell us how the keys are hashed but it gives us two rules:

    • The value returned by h(k) shall depend only on the argument k.
    • For two different values k1 and k2, the probability that h(k1) and h(k2) “compare equal” (sic) should be very small.

    The first rule says that h(k) must be deterministic. This is in contrast with languages like Java, where the hash value can depend on a random number if you want (as long as the value remains the same throughout the execution of a given program).

    It is a reasonable rule. It means that if you are iterating through the keys of an “unordered set”, you will always visit the keys in the same order… no matter how many times you run your program.

    It also means, unfortunately, that if you find two values such that h(k1) and h(k2) are equal, then they will always be equal, for every program and every execution of said programs.

    The second rule is less reasonable. We have that h(k1) and h(k2) are constant values that are always the same. There is no random model involved. Yet, somehow, we want the probability that they will be the same to be low.

    I am guessing that they mean that if you pick k1 and k2 randomly, the probability that they will hash to the same value is low, but I am not sure. If it is what they mean, then it is a very weak requirement: a vendor could simply hash strings down to their first character. That is a terrible hash function!

    I am under the impression that the next revision of the C++ standard will fix this issue by following in Java’s footsteps and allowing hash functions to vary from one run of a program to another. That is, C++ will embrace random hashing. This will help us build safer software.

    linuxjournalmx 6:25p
    Numerical Python

    http://feedproxy.google.com/~r/linuxjournalcom/~3/BpYrwe_q9Uw/numerical-python

    For the past few months, I've been covering different software packages for scientific computations. For my next several articles, I'm going to be focusing on using Python to come up with your own algorithms for your scientific problems. more>>

    feld_feed 3:43p
    Massachusetts Has An Innovative Approach To Immigration Reform

    http://feedproxy.google.com/~r/FeldThoughts/~3/BTniRpQBK-o/massachusetts-innovative-approach-immigration-reform.html

    http://www.feld.com/wp/?p=10127

    Two big proposals from Massachusetts Governor Deval Patrick today. First, he’s proposing to ban non-competition agreements. He’s also proposing an incredibly clever and innovative approach to immigration reform applicable only to Massachusetts.

    I lived in the Boston area for twelve years (Cambridge for four years and Boston for eight years). Even though I often say that was 11 years and 364 days too many for my “non-big city, non-east coast” personality, Boston still has a sweet spot in my heart. I had an amazing (and often excruciating) experience at MIT which was foundational to my personality, thought process, and character. I started and sold my first company there (first office – 875 Main Street, Cambridge; last office – 1 Liberty Square, Boston). Techstars Boston was the first geographic expansion for Techstars. I’m not a sports fan but I always root for the Red Sox. I think I have more close friends in the VC business in Boston than in the Bay Area. Two of my closest friends – Will Herman and Warren Katz – both live there. And I know my way around downtown Boston – even after the Big Dig – better than any other downtown in the world.

    The Massachusetts non-competition situation has always been stupid. In 2009, my partners and I at Foundry Group joined a coalition of VCs to try to eliminate non-competition agreements in MA. It’s awesome to see Governor Patrick take action on it since it’s one of the major inhibitors of the MA entrepreneurial scene.

    The immigration reform proposal is even more fascinating. It’s a great example of creative and innovative public-private policy at the state level to encourage and enhance entrepreneurship. Jeff Bussgang from Flybridge explains it succinctly in his post, so I’ll just repeat it here.

    “The idea is a simple one:  create a private-public partnership to allow international entrepreneurs to come to Boston and be exempt from the restrictive H-1B visa cap.  How is it possible to do this?  The US Citizenship and Immigration Services Department (USCIS) has a provision that allows universities to have an exemption to the H-1B visa cap.  Governor Deval Patrick announced today that the Commonwealth of Massachusetts will work in partnership with UMass to sponsor international entrepreneurs to be exempt from that cap, funding the program with state money to kick start what we anticipate will be a wave of private sector support.” 

    Brilliant. As our federal government continues to struggle to make any real progress on immigration reform, I love to see it happening at the state level. In addition to being good for innovation, it’s the kind of thing that dramatically differentiates states from one another on a policy, business, and innovation dimension that actually matters and likely has significant long term positive economic impacts on the region.

    Governor Patrick – kudos to you. Governor Hickenlooper – I encourage you to roll out exactly the same thing in the State of Colorado. I know exactly the people at CU who would be happy to lead this, as would I. And since one of our Senators (Michael Bennet) is leading the immigration reform effort in the US Senate and our other Senator (Udall) has been a strong supporter of the Startup Visa and immigration reform from the first discussion about it in 2009, I expect you already know your broad constituents support it.

    Oh – and to my friends in NY who have been helping on the immigration reform front, let’s crank this up in NY also! Why should MA have all the fun?

    The post Massachusetts Has An Innovative Approach To Immigration Reform appeared first on Feld Thoughts.
