Looking through the Android 11 features, they seem like a decent improvement. The thing that stuck out to me was the link at the end to "Pixel exclusive features." After reading the list, it didn't seem like a single thing on it was tied to specific hardware requirements. Is Google really moving features out of Android AOSP to make their phones look better?
I couldn’t find that link in the page (Ctrl-F, ‘pixel’), would you mind sharing that URL? I’m curious to see what’s on it.
I think the URL changed from a more consumer/PR-focused blog post to the current developer-focused URL.
would you mind sharing that URL?
This is the list of “Pixel first” features. https://www.blog.google/products/pixel/android11-exclusive-pixel-features and they specifically say “And this time, with new Pixel-first features on Android 11, your Pixel has even more smarts to make it better and more helpful…” The Pixel support thread (https://support.google.com/pixelphone/thread/69861931?hl=en) similarly sells these new features from an (as I would term it) advertising perspective.
Is Google really moving features out of Android AOSP to make their phones look better?
Not to sound jaded, but why would this be a surprise? Or to rephrase that, if Google has teams dedicated to Pixel support, why would they force those teams to put everything developed into AOSP? As a company, they have to find some way/reason to market their products and I wouldn’t expect them to leave the Pixel stock.
This is like the old Arthur C. Clarke story "The Nine Billion Names of God," wherein Tibetan monks use a computer to accelerate their sect's goal of enumerating every possible name of God. As they finish, the Universe ends. Oops.
That and Unsong, which features a kabbalistic sweatshop trying to brute-force names of God.
Company: Foxconn (Wisconsin)
Company site: https://foxconnjobs.us/
Position(s): Industrial Software Engineering, AI/ML Software Engineering, Content and Marketing, among others
Location: Southeastern Wisconsin
Description: Foxconn is the largest contract electronics manufacturer in the world. We manufacture most Apple products, graphics cards, HPC, and game consoles – over 40% of all consumer electronics sold worldwide.
Foxconn has entered Wisconsin and is in the process of building a science and technology park between Milwaukee and Chicago. The plan is to leverage Foxconn’s extensive manufacturing experience, manufacturing data, and HPC manufacturing capability to achieve smart, worry-free factory operations.
As for your professional development, as the 5th largest private employer in the world with global factory operations, there’s a lot of room for experienced and passionate people to learn, especially in the fields of industrial AI, factory IoT, AI sales, or electrical engineering. I encourage you to check out the job site and see if there’s a position that interests you. Foxconn is definitely not a traditional startup company, but it provides unique opportunities to learn about smart manufacturing.
Contact: Please send me a PM on this site or apply directly on the company site. I’d be happy to answer questions.
I love this, and I meta-love thinking about the reaction I’d get from co-workers were I to do this. I’d feel like Dale Cooper taking the Twin Peaks Sheriff’s Department on a rock-throwing tour of Tibet.
Now I need to decide on a deck to get for this. Not sure whether to get the Neon Moon Tarot or a more traditional one.
I use a Thoth deck as well as a My Little Pony: Friendship is Magic themed one. The pony deck seems to be the most effective.
I always misread thoth deck as thot deck.
It is neat to see all the different decks people have come out with, but the Alchemical deck is the one I grew up with.
There’s always Steve Jackson’s Silicon Valley Tarot: http://www.aeclectic.net/tarot/cards/steve-jackson-games.shtml
If you want to learn more about tarot, acquire a book or two on the subject and a deck. Then draw a card daily and read about it in the book during a spare moment.
Holistic Tarot and Alejandro Jodorowsky’s Way of the Tarot are both good.
I’m surprised by the continuous quality posting and commenting on here after a long seven years. Happy birthday!
I agree with your sentiment, but I think it’s better to be self-policing than to have explicit rules in this case. Both of your examples are low-quality posts (basically just “some people’s short opinions”) that have been downvoted to zero points or below.
Following your logic that we should submit “a link to the official announcement’s site,” it would be foolish to outright ban a domain considering there may be some twitter thread which functions as a primary source. You can see some examples of this by searching by the twitter domain on lobste.rs.
Searching “rob pike /bin/true” has the top result linking to the twitter thread.
They’re accomplishing the same thing with the Google Home appliance, where every time you use it you must say the word “(Hey) Google.”
The company could absolutely let you choose a different wake word like Amazon’s Alexa, but it’s advantageous to keep the company on people’s lips.
On-topic: I’m only on the second page of this paper and I’m already loving it. I was about to go to bed, but I’m definitely not going to bed until I’ve read this paper now!
Minor suggestion: It would probably be better to link to the arxiv page for the paper rather than the PDF itself. By linking to the arxiv page people can read the abstract, see related papers, see information about the authors, see bibtex, and a whole lot of other things without downloading the PDF. It’s also very easy to go from the arxiv page to the PDF and I don’t know of any decent way to go from the PDF to the arxiv page.
There’s an identifier on the left side of the page that you can search, e.g. 1903.00982 in this case.
Heh, what’s your stance on bedtime now, having made it to page 11? I would want a good night’s sleep before tackling that one.
I’ve started doing this too as of two months ago, but have it in the form of daily posts on a hugo-generated public-facing blog, all generated from yyyymmdd.md files in a git repo. The practice has been great for a number of reasons:
Like the author of this post, I use headers like “Chess,” “Movies,” and “Things I’m Liking” in most entries, so it should be trivial to scrape the journal entries and collect the text or unordered lists under each heading into a time-series data structure, whether for building a master list, for searching, or for later analysis.
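For instance, here’s a minimal Python sketch of that scraping step. It assumes “##”-level headings and a flat journal/ directory of yyyymmdd.md files; both the directory name and heading level are just my own layout, not anything general:

```python
import pathlib
from collections import defaultdict

def scrape_journal(journal_dir="journal", heading_level=2):
    # Collect (date, text) pairs under each heading, across every entry.
    marker = "#" * heading_level + " "
    sections = defaultdict(list)
    for path in sorted(pathlib.Path(journal_dir).glob("*.md")):
        date = path.stem  # filenames are yyyymmdd.md
        current, buf = None, []
        for line in path.read_text().splitlines():
            if line.startswith(marker):
                # A new heading closes out the previous section.
                if current:
                    sections[current].append((date, "\n".join(buf).strip()))
                current, buf = line[len(marker):].strip(), []
            elif current is not None:
                buf.append(line)
        if current:
            sections[current].append((date, "\n".join(buf).strip()))
    return dict(sections)
```

Then `scrape_journal()["Chess"]` gives a date-ordered list of everything ever filed under that heading, ready for searching or analysis.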
But the biggest downside to this is that it’s public, so I can’t post anything ethically questionable, illegal, or too personal, for fear of open source intelligence collection or future employers. So I think I’ll adopt the same method of a bunch of plain text documents on different topics for more personal notes.
I find the single most useful thing has been using it as a place to ask myself questions, and answer them.
Never thought of this myself; it seems like a great strategy for more in-depth analysis of a topic than what I’ve been doing.
I think the future is going to be like China’s Alipay or WeChat Pay, where you’re logged into a payment app on your phone, scan a QR code on the webpage, and then enter a PIN on your phone or the webpage. These apps integrate with your credit/debit card or bank account and act as middlemen. This works well in China and makes buying stuff on taobao, ordering takeout, or paying monthly subscriptions very easy, and you can even scan QR codes in real life to buy things at stores.
My guess is that Venmo will be the most successful in the US and will have some features like this soon, due to its popularity with people my age. It has already cornered person-to-person digital payments, so its next obvious evolution is integrating with other apps or in real life. It’s an option on Uber as of a month ago, so it’s on its way there.
At this point you have the kind of usability the article pontificates about.
I’ve been doing this in Python and it’s been a blast. Day 3 introduced me to numpy and matrices! Of course, it’s humbling seeing others’ solutions and how much more elegant they are, but that’s been a great learning opportunity too.
Don’t forget there’s a lobsters leaderboard!
Started learning Chinese (characters/pinyin) again. It only took the better part of a decade to realize that I should just learn a helluva lot of vocabulary. I’ll also be reading through The Rust Programming Language and diddling on Yousician for piano, which has been pretty great.
Regarding learning languages - has anyone had success with learning vocabulary for multiple languages at once?
Some tips as a Chinese learner who has been studying Chinese full time in China for half a year so far, and longer abroad:
It’s a really hard language that’s inseparable from the culture, especially at a higher level when you’re using 成语 (chengyu, idioms). If possible, find a way to study Chinese in China; it’s cheap compared to US education even at the best universities, and you can make money on the side with an (illegal) English teaching gig. Good luck!!
As a budding Go programmer just out of college, I found this one of the most useful posts for understanding the language and best practices, especially that scatter/gather pattern. Thanks for spending time preparing for the presentation, and kudos for posting it here!
A good resource for additional patterns is https://gobyexample.com/.
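For anyone who hasn’t seen the talk: the scatter/gather idea is basically “fan the same request out to several replicas concurrently, then take the first answer back.” The presentation’s code is Go; this is just a rough Python sketch of the same shape using concurrent.futures, with made-up replica names and a simulated query standing in for real network calls:

```python
import concurrent.futures
import random
import time

def query(replica, request):
    # Stand-in for a network call to one replica, with simulated latency.
    time.sleep(random.uniform(0.01, 0.05))
    return f"{replica}: result for {request!r}"

def scatter_gather(replicas, request):
    # Scatter: fire the same request at every replica concurrently.
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=len(replicas))
    futures = [pool.submit(query, r, request) for r in replicas]
    # Gather: take whichever reply lands first; stragglers are ignored.
    done, _ = concurrent.futures.wait(
        futures, return_when=concurrent.futures.FIRST_COMPLETED)
    pool.shutdown(wait=False)
    return next(iter(done)).result()
```

So `scatter_gather(["replica-a", "replica-b", "replica-c"], "user 42")` returns the fastest replica’s answer, which is the trick the talk uses to cut tail latency.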
I love Rust and I don’t think it is over-hyped, really. But why go and defend Java? That just hurts the argument IMO.
While not every Java prophecy came true as foretold, I think it was very successful overall.
Android runs on Java, and it’s the most popular consumer OS in the world, so billions of devices really do use Java. It really is write once, run on a bunch of devices with vastly different hardware, even Chromebooks and Windows 11. For over a decade it was probably the only sensible option for high-performance servers.
Java looked really good in that company. And today despite much bigger choice of languages available, Java is still one of the most popular ones.
The book “modern C++” was published in 1992. Unfortunately I can’t actually find a reference to that book online. As I recall it had a purple cover.
I think of https://github.com/isocpp/CppCoreGuidelines when I hear “modern C++”.
I thought the “modern C++” phrase originated with Alexandrescu’s book, Modern C++ Design, published in 2001.
“There are only two kinds of languages: the ones people complain about and the ones nobody uses”
Java is probably the most critical programming language in the enterprise space, the backbone of the majority of mobile devices (Android), and was used to create the best-selling video game of all time (Minecraft). Its time is waning, but it lived up to the hype.
I don’t think these things are related. Sure, Java is entrenched, and sure, it’s very popular in some verticals, but it hasn’t managed to become popular for the things C was popular for (mostly), and the run-everywhere thing sort of fell flat as x86 crushed everyone. I’m not sure I would say it came close to “lived up to the hype,” but maybe it depends on one’s memories of the hype.
Looking back now, I’d say it did. Normalizing managed code and garbage collection alone would qualify, but add robust, cross platform concurrency primitives and stable, cross platform GUI, classloaders and all the stuff about OSGi… I resent it for killing off Smalltalk’s niche, but it moved a much larger piece of software development in a good direction.
You’re looking at niches that C has kept, rather than all the uses that C lost to Java. C used to be the default for most applications, not only low-level and performance-critical ones.
On mobile, where “Wintel” didn’t have a stronghold, J2ME crushed it, and delivered some portability across devices with vastly different (and crappy) hardware.
“Become popular for the things C was popular for” is kind of an impossible standard to hold any language to. Back in the day when Java was new, C was popular for everything. I know I personally wrote or worked on multiple backend business-logic-heavy services in C that would have been much better fits for Java had it existed in mature form at the time.
Even at the height of Java’s early hype, I can’t remember anyone credibly suggesting it would replace C outright.
Write once, run everywhere is still valuable. My team develops on MacOS, Windows, and Linux, and we deploy to a mix of x86 servers and on-premises low-power ARM devices (think Raspberry Pi). The same JVM bytecode (Kotlin, not Java, in our case) works identically enough in all those environments that we haven’t felt any urge to push people toward a single OS for dev environments. We have far more headaches with OS incompatibilities in our Python code than in our Kotlin code, though admittedly we’re not doing the same stuff in both languages.
This seems slightly ahistorical — before C’s niche was God of the Gaps-ed into “tiny performant chunks of operating systems where zero copy memory twiddling is critical” it was everywhere. Like, people were writing web applications in C. People were doing things with C (and the godforsaken C-with-objects style of C++) that nobody today would go near in an unmanaged language. It was Java that showed the way. And I am no fan of Java-the-language.
Which is itself funny because everything good about Java was available elsewhere. The difference was the giant education campaign and campaign to embed the vm everywhere.
Oh, I know.
Now that we have AWS Graviton, I’ve found Java people do have an easier time.
I think that nobody can deny that Java has been widely successful.
To say it lived up to the hype, we first have to define what the hype was. If I remember correctly, it was first hyped for applets. Java’s successes have been elsewhere.