Love Calculator API

The Love Calculator provides a score from 0% to 100% based on the names of two people, meant as an indication of a match in terms of love. The higher the percentage, the better the match. To find out what the chances are for you and your dream partner, just fill in both full names (first and last) in the two text boxes and press Calculate. Note that, like all other love calculators on the Internet, this calculator is intended for amusement only rather than as a real indication of love.

Fall three times and stand up four (detailed)

2020.09.16 09:32 V3rbolten Fall three times and stand up four (detailed)

After 3 takes, I finally passed today (9.15). I can’t believe I did it & how I did it. This exam was a challenge for me from the very beginning, not to mention my focus problems, not being a fast reader, & a brain that’s more inclined to the arts than the analytical side, but I really wanted this.
Ok before I share, SMASHEM920 whoever you are (i’m new here & can’t find you...i tried), THANK YOU! HUGS!!!! I read your post/reply from August 14 & it HELPED me a lot.
1) PrepCast: Subscribed to them after my first failure (initially used PmTraining, but questions from PmPrepcast were “kind of” similar to the actual exam) *watched Cornelius’ videos & took notes (love his British accent) *actually found ITTO pages online, printed them, took notes on those, & made a binder *Note that 90% of the REAL PMP questions are still worded differently than the simulations, & today I had 1 question which was given to me previously (it was a change request one).
2) While studying my ITTO binder (i can send you a sample page lol), I had Rita, PMBOK, & Crosswind book as reference *I also took a Crosswind class & their book highlights the important ITTO’s for the 49 processes & those were reflected on my binder as well *I tried to memorize the ITTO’s but my poor brain just can’t so I just really tried hard to understand how the ITTO’s relate/connect
3) I watched Praizion’s Main Line video on youtube (AGAIN, THANK YOU SMASHEM920! ) * PLEASE WATCH THIS, YOU WON’T REGRET IT! This guy also makes videos explaining PMP processes while driving (That’s some focus!) It’s entertaining & very educational, TRUST ME!
4) YESSS SMASHEM920 WAS RIGHT, FAMILIARIZE YOURSELF WITH PAGE 89 of PMBOK *When I opened that page, I got confused after seeing a chart, but just familiarize yourself *Since I also have difficulty memorizing, after each Proj Doc, I wrote what process it came from (Ex. Basis of Estimates- EAD, EC, EAR)
5) Simulation Exams: I only took 2 simulations this time *Rita- although hers stopped after question 89 like the real exam, the page lagged often while the timer kept going; the questions were ok but not like the exam’s, & the explanations of the answers weren’t enough. The explanations to me were very important since I want to know why I got it wrong. No, I did not pass this simulation.
*PmPrepcast- I retook the simulation of the worst score I had from last year. I went from a 62% last year to 75%, which still wasn’t where I needed to be (this was before watching Praizion & going through my binder). By the way, PmPrepcast has THE BEST ANSWER EXPLANATIONS! I love how they explain why the other choices are incorrect, & although you may argue with some, they still have explanations for that.
6) Almost everyday I would write down the process chart (6x10) w/ the 5 PM Processes, 10 Knowledge Areas & 49 Processes
7) I brushed up on the formulas the last 3 days before the exam. This was my 4th so I already remembered majority of them. If this is your first time, start studying these waaaaaay in advance.
8) I also got this last year from Amazon 8 Pages Quick Reference Guide - Project Management Professional (PMP) Certification Exam - 6th Edition Updated - March 2018
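For anyone starting the formula review early, the most heavily tested ones are the standard earned-value formulas from the PMBOK. A quick worked example (the numbers here are made up for illustration):

```typescript
// Standard earned-value formulas with illustrative numbers:
// EV = earned value, AC = actual cost, PV = planned value, BAC = budget at completion.
const ev = 200;   // value of work actually completed, in $
const ac = 250;   // what that work actually cost
const pv = 220;   // value of work planned by now
const bac = 1000; // total project budget

const cpi = ev / ac;   // cost performance index: < 1 means over budget
const spi = ev / pv;   // schedule performance index: < 1 means behind schedule
const eac = bac / cpi; // estimate at completion (typical-variance case)

console.log({ cpi, spi, eac }); // cpi: 0.8, spi: ~0.91, eac: 1250
```

So this hypothetical project is over budget (CPI 0.8), behind schedule (SPI ~0.91), and forecast to finish at $1250 against a $1000 budget.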
We drove 2hrs for my 12nn test. Entered the test center at 11:30am, read the rules, checked in, signed, took a photo & placed items in the locker (wallet, water, passport holder & peanuts). Oh, you also need to wear a mask while taking the exam (A MUST!).
After the proctor checked my pockets & glasses, I was provided a laminated bound sheet (4-5 pages with both sides blank to write on) & a pen. I also asked for a Kleenex. They could only give me one, I don’t know why, but it was all I needed. I folded the Kleenex to insert it on top of my blue surgical mask so my glasses won’t get foggy when I breathe (Yes, I went to YouTube for tips last weekend).
The proctor escorted me to a computer station. The computer station has noise-cancelling headphones (similar to what my FiL uses when he mows the lawn), and no mousepad. I used the headphones before, but they were so tight they hurt my head & neck. Don’t worry though, because you can also ask for earplugs (the blue foam ones).
The proctor will log in to the computer & set up. It will start with a tutorial on how to take the exam. After the 7min tutorial, you can proceed to start the exam (but first, take a deep breath).
If you have a brain dump, you can only start writing when you see the 1st question. And nope, I did not do a brain dump; although I memorized formulas & the 6x10 process chart, I didn’t write them down because time was too precious for me (refer to 1st paragraph of this post lol).
On the test screen... I hope I remember these right. Upper Left Clickables (From L-R): *Comment- you can actually comment on the question, if it’s vague, or whatever you want to express, & it will be sent to PMI. OK, who has time to comment? If someone did, I’m curious as to what the comment was (PMI, it’s me).
*Highlight (yellow)- you can click this after you highlight an answer choice you pick or you want to discard (whatever you choose my friend)
*Strikethrough- highlight an answer choice you want to strike & click strikethrough
*Calculator- for the Maths c”,)
After question 89, the screen will ask you if you want to review. Click yes, and it will take you to the review page where you’ll see the questions you flagged & the unanswered ones (it will say incomplete in red). For me, the unanswered/incomplete ones were the Math ones that I had to analyze &/or solve (TIP: if you know a question will take you more than a min, flag it & come back to it). After you’re done reviewing, click end review & the screen will tell you that there is a 10-min break; if you want to take it, click break, if not, click next & continue the test. It also tells you that if you take the break, you won’t be able to go back to the 89 questions you answered. Of course, I needed a break! Are you kidding me?! I was 7 questions behind & it already felt like 3hrs.
If you decide to take the break, raise your hand & the proctor will wave at you to come out (pre-COVID19 proctors will enter the testing room to escort you out). Leave the laminated sheets with the pen & take your locker key & passport with you. Make sure you look at a clock somewhere to keep track of time. 10 mins is not a lot but I’m glad I took it. Remember the peanuts I brought with me? You better believe it, I had some during the break & water. Did some stretches (neck & upper back stretches, Yoga Eagle Pose with hands only).
When the break is done, the proctor needs to check you in again, but this time I walked back to my station alone, took a deep breath & carried on with the battle. Ok so if you took the break & came back late, the computer will already be started up showing question 90, & the timer runs again until you show up (I was thinking of Samuel L. Jackson’s Tick Tock mother haha).
Ok soo when you are nearing the end of the exam, the screen will tell you that you have 5 minutes left & the timer on your upper right will turn red too. Finally, being 7-12 questions behind the entire time, I finally caught up during the last 10 questions (I needed Tums). After answering question 200, I had 30 seconds to spare & had to let go of the questions I had flagged.
When the timer runs out, click end exam. Seriously, PMI doesn’t play. Before you even say your prayers, BAM! you’ll be greeted by a “Congratulations, you passed, wait for your results 1-3 days” etc., or “Unfortunately, you did not pass at this time, try again.” I’ve seen both screens, & the latter one hurts every time. Today, I saw Congratulations! I don’t know how, but I was so happy that I silently cried in front of the test PC.
1) If you passed the first time... CONGRATULATIONS!!!!!
2) If you haven’t passed...take a break, give yourself a couple days BUT KEEP GOING!!!!
3) Schedule your exam day as soon as you can, works for someone who needs a little bit of a push / who procrastinates
4) Use Mnemonics- mine was “I Saw Santa Claus Quietly Rapping Coolio’s Rapper’s Paradise Song” (Gangsta wasn’t possible so I had to make it work)
5) Do something else besides studying. I learned how to sew masks. I still ended up using the blue surgical one since the elastic is light & doesn’t weigh on my ears. Just take a break from studying.
6) Read comments on Reddit from people who passed. This was hard for me to do, but I wouldn’t have seen SMASHEM920’s post if I let my feelings take over. Make a list of the suggestions, try them & see what sticks / works for you
7) Please wear something comfortable on exam day. It helps.
8) If you get sleepy after lunch on a normal day, do not eat anything fried & sweet on your exam day. Carbs make you sleepy. Eat a lot of protein. I baked chicken breast, had chickpeas, greens, boiled egg, brown rice (not kidding, & it was not yum but I chose to eat bland yesterday). I also had to do all my food prep the day before & stuck those in the fridge.
9) Couple days before or the day before (which I did), prepare ALL stuff you need to take during the exam day (ID, passport, h20, food, snacks, meds). Oh the meds, I brought dramamine (make sure it’s NON DROWSY), tums, advil, even icy hot for my neck & a tennis ball to lean on my back during the 2hr drive to the testing center. This is your day, do whatever makes you feel comfy.
Hmmm I think that’s it. Feel free to comment for questions. Hugs!!!!!!!
submitted by V3rbolten to pmp

2020.09.15 15:00 Sapphire_Rapids Announcing [email protected] - the best way to create and share new spreadsheets and apps. Build on the Aspire platform and share your work with the community!

Hey everyone,
I am incredibly excited to be announcing [email protected] today. In Aspire's nearly three years of existence, I have loved seeing the creativity, ingenuity, and passion for budgeting/finance in the Aspire community. Many of you have shared right here in this subreddit your custom solutions, tools, and spreadsheets. These tools range from the Aspire mobile apps, to CLI tools, to spreadsheets for savings, forecasting, and other use cases.
With [email protected], there will now be a dedicated website/repository for all these shared projects. I can't wait to see what you all build.

How to get started and quick start links

I've drafted a few pages dedicated to the community guidelines for the [email protected] program. The links to these are below. In essence, these docs detail out the types of extensions you can make, API / named range information you can leverage, and other details. These guidelines are in flux as I refine them, but the basics are all there. These guidelines exist to ensure all the builders have a basic target to aim for and to help all the Aspire users have a consistent experience (as they are the people we're helping and building for).
Please use the Aspire Budget v3.3 Preview for the base of your extensions.
Read the [email protected] docs →
Review the available API details and Named Ranges →

Types of extensions

Spreadsheets
If you've used Aspire, you're familiar with spreadsheets. Spreadsheets are incredibly versatile. I see endless possibilities for extensions with forecasting models, savings calculators, new types of dashboards and reports, insights and analysis tools, and more.
I've even created a starter template with the basics for you.
Apps / Mobile apps
This is a pretty wide category, but this could include anything from website utilities to mobile apps. We've had a few passionate users create magnificent mobile apps already. If starting a new app isn't for you, I know they would love your help and collaboration.
CLI tools
Command line interface tools are runtimes and scripts that are invoked through a terminal window or other text based program. I've seen a handful of these made for Aspire and they range from parsing bank information to merging data from multiple sources. CLI tools are a great way to accomplish technical tasks.
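For a flavor of what such a tool looks like, here's a minimal sketch of a parser for a hypothetical bank-export CSV. The `date,payee,amount` layout is made up for illustration — real bank exports vary, so the column mapping would need adjusting:

```typescript
// Hypothetical CSV layout: a header row, then one date,payee,amount per line.
interface Transaction {
  date: string;
  payee: string;
  amount: number; // negative = outflow
}

function parseBankCsv(csv: string): Transaction[] {
  return csv
    .trim()
    .split("\n")
    .slice(1) // drop the header row
    .map((line) => {
      const [date, payee, amount] = line.split(",");
      return { date, payee, amount: Number(amount) };
    });
}

// In a real CLI this string would come from a file passed on the command line.
const sample =
  "date,payee,amount\n2020-09-01,Grocery,-42.50\n2020-09-02,Paycheck,1500";
console.log(parseBankCsv(sample));
```

From there, a real tool would reshape these rows into whatever layout the Aspire transactions sheet expects.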

How do I get my tool in the [email protected] repository?

I've created a GitHub repository that's dedicated to [email protected]. You can learn more here.

What's the timeline?

I'll immediately start reviewing anything that gets submitted on the GitHub page. The target is December 1st to officially bring the [email protected] online for all Aspire users and consumers.

Other questions?

Leave a comment below with any questions/comments. I've started the [email protected] Reddit chatroom for any tinkerers that want to share ideas and get started (we'll start with Reddit chat - we may find that a tool like Discord is better suited for this purpose).
submitted by Sapphire_Rapids to aspirebudgeting

2020.09.13 12:56 Mautriz I've been trying out Nexus Framework (Typescript) for the last days and it seems amazing

I've been looking for a backend framework for graphql for a while and I'll list the main ones I've tried with the problems I've faced
Nestjs (GraphQL) - Not really well documented; I had many troubles setting up different stuff, e.g. dataloaders, file upload, imported custom scalars. You have to bring in a decorator ORM, there's some code duplication, and the current choices are TypeORM (which is slowly dying) and MikroORM (which is not used very much compared to TypeORM, but seems fine).
Typegraphql - Better documentation and probably a bigger community (graphql-wise); same problem as above with the ORMs and code duplication. It has many built-in features that I don't like/use and is relatively slow (or at least it has been in the past; I think it has gotten better now).
Postgraphile - Seems like a great option: perfectly customizable, amazing performance, makes you design the database in a great way, and autogenerates logic for crud operations. It's generally a good choice, the problem being the tooling around building custom resolvers/gql in Postgres, which doesn't help too much with autocomplete (you CAN do it, but it requires some workarounds from what I understood).
Now, what I was looking for was a typescript-first framework that could auto-generate crud mutations/queries but remaining fully customizable while being type-safe and have the possibility to do custom queries to the db with its db layer with as less boilerplate as possible.
Nexus GraphQL - Nexus uses (with its main plugin at least) Prisma 2 for its database access, which I personally love; the API is a joy to work with imo. It has a crud plugin for easily generating FULLY CUSTOMIZABLE crud operations. By fully customizable I mean that you can easily change the input shape as you want, make computed properties, or reuse the full generated query wrapped by your resolver SUPER EASILY. You can reuse the same crud generator with different aliases and customization, and you can pretty much build the vast majority (if not all) of your API by just reusing the crud API, making nested relations and connections (relay connections as well) easy even in custom resolvers. The API seems a little strange at first, but I started preferring it over decorators before long; it removes a lot of code duplication while remaining simple and type-safe without even explicitly declaring types, and the same goes for context and whatever else.
The only part where there is some code duplication is between defining the schema (in the Prisma model file) and defining your GraphQL API (for example, when you want to omit the password field in the API). This remains type-safe and with code completion tho, so it's not really a pain.
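A minimal sketch of that pattern, based on the 2020-era `@nexus/schema` + `nexus-plugin-prisma` APIs (pre-1.0, so exact import paths and option names shifted between releases); the `User` model and its fields here are hypothetical:

```typescript
// Assumed Prisma model:
//   model User {
//     id       Int    @id @default(autoincrement())
//     email    String @unique
//     password String
//   }
import { objectType, queryType, makeSchema } from "@nexus/schema";
// Plugin import path varied across pre-1.0 releases; this is the 2020-era shape.
import { nexusSchemaPrisma } from "nexus-plugin-prisma/schema";

// Re-declare the GraphQL shape of User, omitting `password` —
// the duplication mentioned above, but type-checked and autocompleted.
const User = objectType({
  name: "User",
  definition(t) {
    t.model.id();
    t.model.email();
    // t.model.password() is simply never exposed
  },
});

const Query = queryType({
  definition(t) {
    t.crud.user();  // auto-generated single-record query
    t.crud.users(); // auto-generated list query, filterable/paginated
  },
});

export const schema = makeSchema({
  types: [User, Query],
  plugins: [nexusSchemaPrisma({ experimentalCRUD: true })],
});
```

The `t.crud` calls are the generated crud operations the post describes; each one can be aliased, given custom input shapes, or wrapped in your own resolver.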
There are other useful plugins that are easy to use with near-zero boilerplate, for example for authorizing fields or adding a complexity calculator.
It is backed by Prisma. The negative part is that it is still not stable (0.26), so it often has breaking changes, and the documentation is not complete yet.
TLDR: Overall, what I love about it is how easy and super fast everything is to build, with no boilerplate, while being fully customizable like any other custom server and having type-safe db access. Really, it seems like everything I was looking for in the past months.
Really, if you like the Prisma 2 API, definitely try this one out; it'll be a joy. The API seems strange at first, but I promise you'll prefer it over decorators after realizing how much it simplifies things.
DISCLAIMER: I'm not affiliated with Nexus or Prisma in any way, just a random guy who is loving the framework and hopes its community will grow
submitted by Mautriz to graphql

2020.09.13 08:03 dl_supertroll Jensen Huang welcomes us to his kitchen

Jensen Huang: (01:13) (silence) Welcome to my kitchen. I hope all of you are staying safe. We’re going to talk about an amazing GPU today. Modern GPUs are technology marvels. The GPU is the engine of large industries from design, cloud AI, to scientific computing, but it is the gamers and their insatiable demand that is the driving force of the GPU, pooling their GPUs to create the largest distributed computer ever. A million gamers united to counterstrike the COVID-19 coronavirus. The result was 2.8 exaflops, five times the processing power of the world’s largest supercomputer, to simulate the virus. Folding@home was able to simulate a hundred milliseconds, a 10th of a second in the life of the coronavirus, and captured the moment it opens its mouth to infect the human cell. Scientists believe this is also its moment of weakness.

Jensen Huang: (02:05) Thank you all for joining this historic fight. We’re going to talk about computer graphics and the work we’re doing to push the boundaries. We love computer graphics and have advanced it incredibly in the time of NVIDIA. As the technology advanced, the expressiveness of the medium has made graphics an invaluable tool to help us understand our world, create and explore new worlds, and tell stories that inspire us. From science to industry to the arts, computer graphics has made a profound impact on the world. And for that, we are privileged to have contributed.

Jensen Huang: (02:38) We’re going to talk about gaming and the infinite ways that gaming is expanding. GeForce PC gaming is large and thriving. Its open and rapidly advancing technology, combined with the amazing creativity of the community, makes magic. Anyone can be a broadcaster. Add a GeForce and you have a personal broadcast station: pros stream their practices, experts stream tips and tricks, friends stream to friends just to hang out. There are over 20 million streamers. Games have become a new art medium. In Minecraft, gamers can build their work of art.
Machinima artists create cinematics made from game assets. Tens of millions are using games to express their creativity. Inside a computer simulation, any sport can become an e-sport. Virtual NASCAR and F1 are already attracting top racers. Like sports, e-sports captures the thrill of victory, the agony of defeat, and the human drama of athletic competition. E-sports is on its way to being the biggest sport.

Jensen Huang: (03:38) I have something special for all the GeForce gamers around the world: four gifts. I hope you like them, and you’ll find new ways to game. First, big news. Fortnite is turning RTX on. Now Minecraft and Fortnite, the number one and number two most played games in the world, have RTX on. Fortnite will get ray-traced shadows, reflections, ambient occlusion, and DLSS too. These effects look fantastic with the art style of Fortnite. I can’t wait to see a Fortnite concert with RTX on. The last one with Travis Scott was watched by 28 million people. Epic made a trailer for you. Let’s play it now. 75% of GeForce gamers play e-sports. E-sports is a game of milliseconds; reaction time is a combination of the gamer and the machine. Let me explain. This is Valorant. In this example, the opponent is traveling at 1500 pixels per second, and is visible in this opening for only 180 milliseconds. A typical gamer has a reaction time of 150 milliseconds, from photon to action. You can only hit this opponent if your PC adds less than 30 milliseconds. Most gamers have latencies far greater than 30 milliseconds, many up to 100 milliseconds.

Jensen Huang: (05:12) Today we’re announcing a new e-sports technology called NVIDIA Reflex. NVIDIA Reflex optimizes the rendering pipeline across CPU and GPU to reduce latency by up to 50%. In September, we’re releasing Reflex with our Game Ready driver. Over 100 million GeForce gamers will instantly become more competitive.
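The arithmetic behind that latency budget is simple enough to write down, using the numbers from the example above:

```typescript
// A target visible for `visibleMs` can only be hit if human reaction time
// plus the PC's added latency fits inside that window.
function canHit(visibleMs: number, reactionMs: number, pcLatencyMs: number): boolean {
  return reactionMs + pcLatencyMs <= visibleMs;
}

// 180 ms opening, 150 ms human reaction -> the PC gets a 30 ms budget.
console.log(canHit(180, 150, 30));  // true: exactly on budget
console.log(canHit(180, 150, 100)); // false: a 100 ms pipeline misses the shot
```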
Valorant, Fortnite, Apex Legends, Call of Duty: Warzone, and Destiny 2 will be the first to integrate Reflex technology.

Jensen Huang: (05:40) E-sports pros and enthusiasts strive for zero latency. For you, we’re announcing an insanely fast and beautiful display: a 360 Hz G-SYNC display designed for e-sports. This display has a built-in precision latency analyzer. Just connect your mouse. The NVIDIA 360 Hz G-SYNC e-sports displays are arriving this fall from Acer, Alienware, Asus and MSI. We’ve made a video comparing gaming on a 60 Hz, 144 Hz and 360 Hz display. You can see immediately how a 360 Hz display will help you target and track an opponent.

Jensen Huang: (06:21) For the 20 million live streamers, we have something really cool for you. NVIDIA Broadcast turns any room into a broadcast studio. NVIDIA Broadcast runs AI algorithms trained by deep learning on NVIDIA’s DGX supercomputer, one of the most powerful in the world. Effects like audio noise removal, virtual background effects (whether graphics or video), and webcam auto-framing, a virtual cameraperson tracking you.

Jensen Huang: (06:44) These AI effects are amazing. Available for download in September, and runs on any RTX GPU. Brandon in GeForce marketing will now show you in a video broadcast.

Brandon: (06:55) Hey everybody. I’m Brandon and I’m very excited today to talk to you about our NVIDIA Broadcast app. Like many of you, I’ve been home a lot more lately. I’ve been video conferencing all day and then gaming and streaming all night. And I have a very basic webcam and microphone setup. NVIDIA Broadcast makes these things supercharged with a lot of new awesome features that really bring it out, using the power of AI and our RTX GPUs. The first one I want to talk about is noise removal.
So I’ve asked my girlfriend to join me with a blow dryer here, and that distracting sound makes it very hard to understand what I’m saying. But when I turn on noise removal in NVIDIA Broadcast, you find that it’s completely gone. And that blow dryer is still going.

Brandon: (07:32) But NVIDIA Broadcast isn’t just awesome audio features. There are some really exciting video features as well. Let’s take a look. First up, we have the ability to blur your background, which you may notice that I need because I have a very cluttered and messy room. But when I turn this background blur feature on, all of a sudden I get this really classy effect, and I can adjust the strength of that from low to high and everything in between. Or if I want, I can actually replace the background altogether. Now I’m in a space station with the magic of AI. It’s that easy. Or if I want to jump into some gameplay, I can remove the background altogether and jump into some Valorant. And now I’m playing with a green-screen effect without actually having to have one at home. I don’t have to play good, but at least I can look good. Sometimes when I’m video conferencing or doing a just-chatting stream, I want to zoom in to get a more personal connection with the audience. But the problem is, I bounce around so much, it’s easy for my head to get out of frame. With the auto-frame feature, it’s like having your own personal cameraman that follows you wherever you go. So if, for example, I want to reach over and grab my cool Valorant hat and show it to everybody, it follows me every step of the way. I just find NVIDIA Broadcast to be really exciting, as both a streamer and as someone who works from home. The ability to remove distracting noise, improve your background and keep yourself in the center of the frame are all awesome features in one app. And I just can’t wait for you guys to try it.

Jensen Huang: (09:05) A new form of art has emerged from gaming called Machinima.
Artists are using game assets to create cinematics. There have been tens of billions of views on YouTube. Most are shorts; some are even recreating entire classic movies. It’s becoming a whole new art genre. Today, I’m going to show you an app that will make these cinematics amazing. It’s called NVIDIA Omniverse Machinima. It’s an app built on our Omniverse 3D workflow collaboration platform. Omniverse is a universal design tool asset exchange with a viewer, based on photorealistic path tracing. The engine is designed to be physically accurate, simulating light, physics, materials and artificial intelligence. We have connectors for most third-party design tools, like 3ds Max, Maya, Photoshop, Epic Unreal, Rhino, and many more. The Machinima app brings in elements and assets from games and third-party collections like TurboSquid, and lets you mix and compose them into a cinematic.

Jensen Huang: (10:03) Creators can use their webcam to drive our AI-based pose estimator to animate characters, drive face animation AI with your voice, add high-fidelity physics like particles and fluids, make materials physically accurate, and then, when done with your composition and mixing, render film-quality cinematics with your RTX GPU. NVIDIA Omniverse Machinima, beta in October. Sign up at machinima.

Jensen Huang: (10:31) Let me show you a demo. We created it in a few days. We started with assets from Mount & Blade II: Bannerlord. You’re going to love this.

Speaker 1: (11:15) Whoa, that was close. You guys are getting better.

Jensen Huang: (11:28) For 40 years, since NVIDIA researcher Turner Whitted first published his paper on ray tracing, computer science researchers have chased this dream to create super-realistic virtual worlds with real-time ray tracing.
NVIDIA, seeing the ultimate limits of rasterization approaching, focused intense efforts over the past 10 years to realize real-time ray tracing on a large scale.

Jensen Huang: (11:49) At SIGGRAPH two years ago, we announced NVIDIA RTX. Now, two years later, it is clear we have reinvented computer graphics. NVIDIA RTX is a full-stack invention. RTX starts with a brand new GPU architecture, but it is so much more. It includes new engine tech and a bunch of new rendering algorithms. RTX is a home run. All major 3D APIs have been extended for RTX. RTX is supported by all major 3D tools. RTX tech is incorporated into all major game engines. There are hundreds of games in development and thousands of research papers on new rendering and AI algorithms enabled by RTX. The RTX GPU has three fundamental processors: the programmable shader that we first introduced over 15 years ago, the RT core to accelerate ray-triangle and ray-bounding-box intersections, and an AI processing pipeline called the tensor core. The tensor core accelerates the linear algebra that is used for deep neural network processing, the foundation of modern AI.

Jensen Huang: (12:52) AI is the most powerful technology force of our time: computers that learn from data and write software that no human can. The advances are nothing short of breathtaking. NVIDIA is doing groundbreaking work in this area. You might have seen our work in self-driving cars and robotics. Computer graphics and gaming will also be revolutionized by deep learning. Let me show you some recent works and the art of the possible.

Jensen Huang: (13:15) The first video is a generative adversarial network that has learned to synthesize virtual characters of any artistic genre, including photorealistic. Second is a neural network that animates a 3D face directly from voice.

Speaker 2: (13:29) You require more Vespene gas. It’s dangerous to go alone. Take this.
Jensen Huang: (13:35) The AI character can speak in any language, be any gender, and even rap and sing.

Jensen Huang: (13:41) Third is character locomotion with an infinite number of poses. Imagine negotiating arbitrary paths and obstacles. The fourth is reconstructing 3D from video. Imagine the possibilities: record video, interact in 3D.

Jensen Huang: (13:59) This one is a deep learning model that learned the physics behavior of cloth animation. Finally, this deep learning model of ray tracing can predict colors of missing pixels so that fewer rays need to be cast and fewer pixels need to be fully rendered. We can achieve orders-of-magnitude speedups. AI is starting to play a giant role in the future of computer graphics and gaming. The powerful tensor cores in RTX GPUs will let us do AI in real time.

Jensen Huang: (14:27) One of the first major AI computer graphics breakthroughs is DLSS. Here’s the challenge: real-time ray tracing is far more beautiful, but requires a lot more computation per pixel than rasterization. The solution is to ray trace fewer pixels and use AI on tensor cores to upres, to super-res to a higher resolution, and boost frame rate.

Jensen Huang: (14:50) DLSS took nearly two years of intensive research. We built a supercomputer to train the network. The DLSS model is trained on extremely high-quality 16K offline-rendered images of many kinds of content. Once trained, the model is downloaded into your driver. At runtime, DLSS 2.0 takes in the low-resolution aliased image and motion vectors of the current frame and the high-resolution previous frame to generate a high-resolution current frame.

Jensen Huang: (15:18) I think DLSS is one of our biggest breakthroughs in the last 10 years. Take a look at these images of Death Stranding, the latest game by Kojima-san. DLSS is sharper than native 4K, creates detail from AI that native rendering didn’t even show, and the frame rate is higher.

Jensen Huang: (15:36) Reviewers have loved DLSS 2.0.
They say its quality beats out native rendering and runs even faster. You can play at 4K without a performance hit. The tensor core effectively gives RTX a 2x performance boost. Let’s look at one frame trace of a game to see the processors of RTX in action.

Jensen Huang: (15:55) Adding ray tracing to games dramatically increases the computation workload. Using shaders to do ray traversal and object intersection reduces the frame rate. We added the RT core, which reduces shader workload by 60%. The RT core offloads the shaders by doing the ray-triangle and ray-bounding-box intersection calculations. Using the same methodology as Microsoft Xbox, the RT core is effectively a 34-teraflop shader, and Turing has an equivalent of 45 teraflops while ray tracing.

Jensen Huang: (16:27) Even with the RT core, the amount of time consumed is significant, so the RT core and shaders have to run concurrently. Even then, 20 milliseconds is only 50 frames per second, still a step back in performance relative to previous generations. This is where the tensor core and DLSS come in: rendering to a lower resolution, then using AI and the super-fast tensor core to effectively double frame rate. Now you can get ray tracing, high resolution and high frame rate at the same time. That’s the magic of the three processors of RTX.

Jensen Huang: (17:03) Turing was our first-generation RTX GPU, combining ray tracing, programmable shading and AI. The flagship Turing had a ton of processing power: 11 shader teraflops, 34 RT teraflops and 89 tensor teraflops.

Jensen Huang: (17:20) Let me show you our new RTX GPU. Ampere is a giant leap in performance. Ampere does two shader calculations per clock versus one on Turing: 30 shader teraflops compared to 11. Ampere doubles ray-triangle intersection throughput. Ampere’s RT core delivers 58 RT teraflops compared to Turing’s 34, and Ampere’s new tensor core automatically identifies and removes less important DNN weights.
The new tensor core hardware processes the sparse network at twice the rate of Turing: 238 tensor teraflops compared to 89. Jensen Huang: (17:59) Ladies and gentlemen, NVIDIA's Ampere GPU. Our second-generation RTX, 28 billion transistors built on the Samsung [inaudible 00:18:09] NVIDIA custom process. All three processors double their rates over Turing, a triple double. It connects to Micron's new G6X, the fastest memories ever made. Jensen Huang: (18:20) The days of just relying on transistor performance scaling are over. Yet Ampere is an incredible two times the performance and energy efficiency of Turing. At Nvidia, we use every engineering lever to squeeze every drop of performance out of the system: architecture, custom process design, circuit design, logic design, packaging, custom SerDes IO, memory, power and thermal design, PCB design, software and algorithms. Thousands of engineers per generation, billions of dollars. Full-stack engineering and extreme craftsmanship are the hallmark of our GPUs. Our performance, energy efficiency and low power are all world-class, and real application performance highlights Ampere's new RT core. The more ray tracing is done, the greater the Ampere speedup. Ampere's RT core doubles ray intersection processing. Its ray tracing is processed concurrently with shading, and Ampere can render cinematic images with motion blur eight times faster than Turing. Let's take a look at Ampere in action. Jensen Huang: (19:25) At our kitchen GTC a few months ago, we showed Marbles, the world's first fully path-traced, photorealistic, real-time graphics demo. It was running on our highest-end Turing Quadro RTX 8000. Turing was doing 720p, 25 frames per second. Today, we're going to run an enhanced version of Marbles with even more special effects, and it is running at 1440p, 30 frames per second, over four times the performance. Jensen Huang: (19:56) Ladies and gentlemen, enjoy Marbles At Night.
Jensen Huang: (21:17) Marbles is entirely path traced, no rasterization, all real time. There are hundreds of area lights, including spherical area lights. There's no pre-baking. Everything is dynamic. The depth of field is film quality and beautiful. Everything is dynamic. Diffuse GI, all dynamic. Jensen Huang: (21:46) There are hundreds of [bridge a bonds 00:21:49], 80 million triangles. Materials are physically accurate, with physics simulation and volumetric rendering in real time. DLSS 2.0 is doing the super resolution and AI denoising. Let's compare Marbles on Turing and Marbles on Ampere. You can see the dramatic visual quality jump on Ampere. Marbles on Turing runs at 720p, 25 frames per second. Marbles on Ampere runs at 1440p, 30 frames per second, more than four times the performance, and Ampere even did area lights and depth of field. A giant performance leap. Jensen Huang: (22:50) Today's games are giant worlds, indoor and out, with photogrammetry, dense geometry and lots of characters. Games are over 200 gigabytes and getting bigger. This is like 50,000 songs or 400 hours of streaming video. Games have pushed PC IO and file systems to a sort of breaking point. Jensen Huang: (23:08) CPUs copy files from disk and decompress the game image. This was fine when the storage system was slow, 50 to 100 megabytes per second. Now, with Gen 4 PCI Express and solid state drives, PCs can transfer data at seven gigabytes per second, a hundred times faster. The CPU copying data to memory and decompressing game images is now the bottleneck. Decompressing data from 100-megabyte-per-second hard drives takes only a few CPU cores. However, decompressing from seven-gigabyte-per-second SSDs on PCIe Gen 4 takes over 20 CPU cores.
Today we're announcing Nvidia RTX IO, with three new advances: new IO APIs for fast loading and streaming directly from SSD to GPU memory, GPU lossless decompression, and a collaboration with Microsoft on DirectStorage for Windows that streamlines the transfer of data from storage to GPU memory. Jensen Huang: (24:02) With Nvidia RTX IO, vast worlds will load instantly. Picking up where you left off will be instant. This is a very big deal for next-generation gaming. Let me show you Ampere in action in one of the most anticipated games of 2020, CD Projekt Red's Cyberpunk 2077. This trailer is called Scenes of Cyberpunk RTX. It shows ray-traced reflections, diffuse illumination, shadows and ambient occlusion, and DLSS 2.0. Enjoy. Ladies and gentlemen, our new flagship GPU, the Nvidia GeForce RTX 3080, powered by Ampere, our second-generation RTX architecture. The Nvidia RTX 3080. I have one right here. Let me show it to you. It is beautiful. Look at this, the RTX 3080. It is wonderfully crafted. It's going to look beautiful in your PC, and it lights up. Jensen Huang: (27:53) Now, let me tell you about some of the other exciting technologies inside. Turing used G6, the fastest memory at that time. The industry thought that was the limit. For Ampere, we had to push through that limit. Working with Micron, we designed the world's first memories with PAM4 signaling: pulse amplitude modulation with four voltage levels that encode two bits of data each: 00, 01, 10, 11. Jensen Huang: (28:18) Each voltage step is only 250 millivolts, so in the same period of time G6X can transmit twice as much data as G6. PAM4 is extreme signaling technology, and it's just becoming used in high-speed networking. The Ampere thermal architecture is the first-ever flow-through design, working harmoniously with the PC chassis cooling system: pulling in cool air from the outside, flowing it through the GPU, and pushing hot air straight out of the chassis.
To allow room for a fan to flow air directly through the module, our engineers architected a super dense PCB design that is 50% smaller than the previous one, while adding the bigger Ampere GPU, HDMI 2.1, PCI Express 4.0 and G6X. Jensen Huang: (29:05) There are two independently controlled fans. The bracket front fan pulls cool air from the bottom and pushes the heated air out through the graphics card bracket. A backside pull-through fan passes cool air over the fins of the heat pipe and directs the hot air to the top and back of the chassis to be exhausted by the system fan. The 3080 flow-through system is three times quieter and keeps the GPU 20 degrees cooler than the Turing design. It can cool 90 watts more than Turing. Jensen Huang: (29:35) The generational leap is ultimately the most important factor of new GPUs. A significant technology advance is needed to inspire content developers to create the next level of content and for the install base to upgrade. Let's see how the 3080 stacks up against previous-generation architectures on the latest graphics-intensive games. The 3080 is faster than the 2080 Ti. The 3080 is twice the performance of the 2080 at the same price. Ampere is the … Jensen Huang: (30:03) It's a 2080 at the same price. Ampere is the biggest generational leap we've ever had. Ladies and gentlemen, the Nvidia GeForce RTX 3080, our new flagship GPU. Powered by Ampere, our second-generation RTX GPU architecture. Incredible amounts of processing in the shader, RT ray tracing core and tensor core for processing AI, 10 gigabytes of G6X, twice the processing power of the 2080, and at the same price, starting at $699. Available September 17. One of our most popular GPUs is the 70 series; the 970, 1070 and 2070 were all hugely popular. You're going to love the new RTX 3070, faster than the 2080 Ti, the Turing enthusiast GPU priced at $1,200. Ladies and gentlemen, the new GeForce RTX 3070. Let me show it to you. Jensen Huang: (31:05) It's a work of art.
20 shader teraflops, 40 RT teraflops, and 163 tensor teraflops for AI processing. With eight gigabytes of G6, the RTX 3070 is faster than the $1,200 RTX 2080 Ti, starting at $499. Available in October. Every generation we pack in our best ideas to increase performance while introducing new features that enhance image quality. Every couple of generations, the stars align, as they did with Pascal, and we get a giant generational leap. Pascal was known as the perfect 10. Pascal was a huge success and set a very high bar. It took the Super family of Turing to meaningfully exceed Pascal on game performance without ray tracing. With ray tracing turned on, Pascal, using programmable shaders to compute ray-triangle intersections, fell far behind Turing's RT core, and Turing with ray tracing on reached the same performance as Pascal with ray tracing off. Jensen Huang: (32:11) On a technical basis, this was a huge achievement. The images are far more beautiful, and reflection and shadow artifacts are gone, but gamers wanted more. They want every generation to be more realistic and higher frame rate at the same time. So we doubled down on everything: twice the shader, twice the ray tracing, and twice the tensor core, the triple double. Ampere knocks the daylights out of Pascal on ray tracing, and even with ray tracing on, crushes Pascal in frame rate. To all my Pascal gamer friends, it is safe to upgrade now. Amazing ray tracing games are coming. Activision and developer Treyarch are launching a new Call of Duty on November 13th. It's a masterpiece and it looks incredible. There are dynamic lights, ray-traced shadows and ambient occlusion, DLSS 2.0, and Nvidia Reflex super-low-latency technology. The last Call of Duty sold an amazing 30 million copies. Activision put together this trailer of never-before-seen footage. Enjoy. Let me talk to you about one more thing.
Several years ago, we started building the Titan, pushing the GPU to the absolute limit to create the best graphics card of that generation. It was built in limited quantities, only through Nvidia. The distribution was limited. The demand surprised us. Creatives were making 4K movies and rendering cinematics, researchers built workstations for data science and AI, bloggers built broadcast workstations, and flight and racing simulation fans built sim rigs. There is clearly a need for a giant GPU that is available all over the world. So we made a giant Ampere. Ladies and gentlemen, the RTX 3090. Come here, come here, papa. All right. The 3090 is a beast, a ferocious GPU, a BFGPU: 36 shader teraflops, 69 RT teraflops, 285 tensor teraflops, and it comes with a massive 24 gigabytes of G6X. It comes with a silencer, a three-slot, dual-axial, flow-through design, 10 times quieter, and it keeps the GPU 30 degrees cooler than the Titan RTX design. But there's more. The 3090 is so big that for the very first time we can play games at 60 frames per second in 8K. This is insane. Because it's impossible for us to show you what it looks like on the stream, we invited some friends to check it out. Roll the clip. Speaker 3: (36:05) I've never been more excited to do anything. Speaker 4: (36:07) Oh. Speaker 3: (36:07) Are you kidding me? Speaker 4: (36:11) Oh my gosh. Speaker 5: (36:12) Oh my God. Speaker 6: (36:14) No way. Speaker 3: (36:15) This is f***ing incredible, dude. Speaker 5: (36:17) This is amazing. [inaudible 00:36:20] This is silly. Speaker 6: (36:24) My god, you can see Raymond's [inaudible 00:36:27]. Speaker 3: (36:26) Look at this. Why is it so detailed? Speaker 6: (36:30) All right, all right, all right, move fast and shoot things. Speaker 4: (36:33) This is 8K, sir. I can see everything. Oh, I need to shoot you, though. Speaker 3: (36:36) Not a whole lot of people have seen something like this. Speaker 4: (36:38) This is so realistic. I feel like I'm really in battle.
Speaker 5: (36:42) This is insane. Speaker 6: (36:44) Die, I want to look at the pretty things. There we go. Speaker 5: (36:47) Dude, the ray tracing is insane on this. Speaker 3: (36:49) These are the sizzle reels that you see. Speaker 4: (36:51) This is basically hacks. Speaker 3: (36:53) And then it's like, "It'll never look like that," but it does. Speaker 5: (36:57) I'm looking across the vistas, the grand vistas that are happening right now. Speaker 3: (37:01) Holy shit, look at this. Speaker 5: (37:02) This feels like a Disneyland experience. Oh, it is so smooth. It's butter. Speaker 3: (37:07) Oh, it's smooth as shit, dude. Speaker 5: (37:09) I can't believe it's not butter. Speaker 3: (37:10) I mean, this is game changing. There's no other way to put it. My mind is blown, dude. Wow. Jensen Huang: (37:20) It's been 20 years since the Nvidia GPU introduced programmable shading. The GPU revolutionized modern computer graphics. Developers jumped on and invented clever algorithms, like shaders that simulate realistic materials, or post-processing effects for soft shadows, ambient occlusion, and reflections. Developers pushed the limits of rasterization beyond anyone's expectations. Meanwhile, Nvidia GPU processing increased a stunning 100,000-fold. Gaming became a powerful technology driver. Gamers grew to billions, and gaming pushed into all aspects of entertainment and culture. If the last 20 years were amazing, the next 20 will seem nothing short of science fiction. Today's Ampere launch is a giant step into the future. This is our greatest generational leap ever. The second-generation Nvidia RTX, fusing programmable shading, ray tracing, and artificial intelligence, gives us photorealistic graphics and the highest frame rates at the same time.
Jensen Huang: (38:25) Once the holy grail of computer graphics, ray tracing is now the standard, and Ampere is going to bring you joy beyond gaming. Nvidia Reflex improves your response time, Nvidia Broadcast turns any room into a studio, and Omniverse Machinima turns you into an animated filmmaker. We are super pleased with the 3070, 3080, and 3090, the first three members of the Ampere generation. You're going to feel a boost like never before. I can't wait to go forward 20 years to see what RTX started. Homes will have holodecks. We will beam ourselves through time and space, traveling at the speed of light, sending photons, not atoms. In this future, GeForce is your holodeck, your lightspeed starship, your time machine. In this future, we will look back and realize that it started here. Thank you for joining us today and to all of our fans for celebrating the arrival of Ampere.
submitted by dl_supertroll to copypasta [link] [comments]

2020.09.11 02:52 crogeniks Dear Devs, we need a better RCON API

Who am I?
Quick presentation: my name's Crogeniks. I am part of the software team at TEAMDIXX, where I mainly focus on tools for Hell Let Loose. I am a software developer by trade; I love games and cracking my head on impossible problems.
This post will get a bit technical on the side, so some of you may not understand everything. If you want me to explain something, ask away, I’ll explain to the best of my abilities.
The Demands
Foremost, I want to say right away that I LOVE Hell Let Loose. It is a magnificent game, and the openness of the Dev team is amazing. I've played the game since the free weekend of October/November 2019 and have had the chance to help the admin team over at TEAMDIXX since February 2020.
Since I’ve had access to the RCON (Remote control admin tooling) in March of 2020, I’ve worked with the server’s API to develop better tooling for the community I’m part of.
I do know other communities are working on the same kind of tool, Dr.WeeD (and all the contributors) comes to mind, with the HLL Community RCON
The Tools Flaws and the Bugs
While no API is perfect, here are some modifications that I think would make the community's life much easier when working with the API
In addition to the above changes, here’s a wish list of features that I think would be amazing for communities.
EDIT: Formatting
submitted by crogeniks to HellLetLoose [link] [comments]

2020.09.04 19:51 theAviCaster Beginner Project : Kafka + Spark Structured Streaming for Real-Time Updates

I'm about to start my first data engineering project and I'd appreciate any suggestions, recommendations for other technologies or advice on shortcomings.
I'm basically trying to set up a pipeline to source, transform and persist real time cryptocurrency price updates.
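That source, transform, persist shape can be sketched with in-memory channels standing in for the Kafka topics and the sink. Go is used purely for illustration (the real pipeline would use Kafka producers and a Spark Structured Streaming job), and names like PriceTick are made up for this sketch:

```go
package main

import "fmt"

// PriceTick is a made-up stand-in for a raw exchange price update.
type PriceTick struct {
	Symbol string
	USD    float64
}

// transform is the "streaming job" of this sketch: it consumes raw
// ticks and emits flattened rows ready for persistence.
func transform(in <-chan PriceTick, out chan<- string) {
	for t := range in {
		out <- fmt.Sprintf("%s=%.2f", t.Symbol, t.USD)
	}
	close(out)
}

func main() {
	source := make(chan PriceTick) // stands in for a Kafka topic
	sink := make(chan string)      // stands in for the persistence layer
	go transform(source, sink)

	// Source: pretend these arrived from an exchange websocket.
	go func() {
		source <- PriceTick{"BTC", 10500.42}
		source <- PriceTick{"ETH", 350.10}
		close(source)
	}()

	// Persist: printed here; a real sink would batch-write to a database.
	for row := range sink {
		fmt.Println(row)
	}
}
```

Whatever stack you land on, keeping the three stages decoupled like this makes it easy to swap the sink or the transform later.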
submitted by theAviCaster to dataengineering [link] [comments]

2020.09.02 20:37 ytzi13 How MMR and the Matchmaking System Works

Last Updated: 2 September 2020

This guide is up to date for Season 14. We don't know exactly what changes are coming in the upcoming "Season 1", so we can't assume that everything here will remain true. I will continue to keep it up to date as I discover new behavior.

Video Alternative

If you don't feel like reading a wall of text, I would recommend taking a look at this video made by u/RyanGoldfish5. It's very well put together, easy to follow, and does a great job explaining the system in a way that should answer most of your questions. However, if you want all of the details, have additional questions that may not have been answered by the video, or want to be sure you're getting the most up-to-date information, I would recommend reading through the following guide and asking questions here if something isn't clear.

What is MMR?

MMR, or Matchmaking Rating, is a hidden number value that represents your rank in-game. The rank and division that you see is a visual representation of this MMR value. Each rank represents a range of MMR values, which is then divided further into smaller ranges that we refer to as “divisions”.
For simplicity’s sake, the MMR values I’m using here are not real values, but numbers meant to easily illustrate how things work.
For example, let’s say that each rank represents 100 MMR, each division 25 MMR:
And so on and so forth.
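Using those illustrative numbers (each rank 100 MMR, each division 25 MMR), the MMR-to-rank mapping can be sketched in a few lines. The rank names and thresholds below are the fake example values from this guide, not Psyonix's real ones:

```go
package main

import "fmt"

// Fake example values from this guide: each rank spans 100 MMR,
// and each of its four divisions spans 25 MMR.
var ranks = []string{"Bronze 1", "Bronze 2", "Bronze 3", "Silver 1"}

// rankAndDivision maps a hidden MMR value to a displayed rank and division.
func rankAndDivision(mmr int) (string, int) {
	idx := mmr / 100
	if idx >= len(ranks) {
		idx = len(ranks) - 1 // clamp at the top of this toy ladder
	}
	division := (mmr%100)/25 + 1 // divisions run 1-4 within a rank
	return ranks[idx], division
}

func main() {
	rank, div := rankAndDivision(130)
	fmt.Printf("%s div %d\n", rank, div) // 130 MMR lands in Bronze 2 div 2
}
```

(This simple mapping ignores the buffer between ranks, which is covered below.)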

Where Can I See My MMR?

Mods (PC only):
Tracker Websites:

The Buffer Between Ranks

In my previous examples using fake data, we've assumed that MMR ranges for ranks are entirely independent from one another. Depending on how you choose to look at it, that's not actually true. Each rank has a range of possible MMR values where division 1 will overlap with division 4 of the previous rank, and division 4 will overlap with division 1 of the next rank. You can choose to see this range as a rank overlap, or you can choose to see each rank as having smaller MMR ranges for divisions 1 and 4, with a range in between them that can be considered no rank at all.
For example:
In this case, we look at the void as having no rank at all, or more simply put: a range of MMR where your rank cannot change. If you’re Bronze 1 and you enter the void, you will still be Bronze 1. If you’re Bronze 2 and you enter the void, you will still be Bronze 2. In order to promote from Bronze 1 to Bronze 2, you must pass through the void and reach 101 MMR. In order to demote from Bronze 2 to Bronze 1, you have to pass through the void and reach 86 MMR. So, technically, it’s entirely possible for a Bronze 1 to be rated higher than a Bronze 2, and that’s true when you compare any 2 ranks in succession.
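The promotion/demotion hysteresis described above can be sketched as a tiny state machine, using the example thresholds from this section (promote out of Bronze 1 at 101 MMR, demote out of Bronze 2 at 86 MMR):

```go
package main

import "fmt"

// Example thresholds from the text: Bronze 1 promotes at 101 MMR,
// Bronze 2 demotes at 86 MMR; 87-100 is "the void".
const promoteAt = 101
const demoteAt = 86

// displayedRank returns the rank to show, given the new MMR and the
// currently displayed rank. Inside the void, the rank cannot change.
func displayedRank(mmr int, current string) string {
	switch {
	case mmr >= promoteAt:
		return "Bronze 2"
	case mmr <= demoteAt:
		return "Bronze 1"
	default:
		return current
	}
}

func main() {
	// Two players at the same 95 MMR can show different ranks:
	fmt.Println(displayedRank(95, "Bronze 1")) // still Bronze 1
	fmt.Println(displayedRank(95, "Bronze 2")) // still Bronze 2
}
```

This is exactly why a Bronze 2 can be rated lower than a Bronze 1.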
Why do they do this?
I can only speculate, but I've always assumed that it's mostly psychological. We used to have ranks with entirely separate thresholds, but it was determined to be unnecessarily frustrating for some people to watch their ranks constantly jump back and forth. This buffer ensures that this frustration won't happen. If you earn a new rank, you usually have to lose 2-3 games in a row in order to demote back down. If you drop down a rank, you usually have to win 2-3 games in a row to get it back. And, unrelated to the change, it's a nice addition for people trying to get that new season reward for the first time ever.

Is this Why the Tracker Website Says that I Should Have Promoted to the Next Rank, but Haven't Yet?

Yes! The tracker websites work off of an API they've been granted access to by Psyonix. As previously discussed, each rank has a demotion value that overlaps with the previous rank, and a promotion value that overlaps with the next rank above it. Tracker sites tend to mistake the demotion value of the next rank for the promotion value of the previous rank. So, using the above example, you may be Bronze 1 div 4 rated 81 MMR and the tracker may tell you that you need just 5 MMR to get to the next rank (86 MMR). But 86 MMR is the demotion value of Bronze 2, not the promotion value of Bronze 1. It's a frustrating occurrence for players who follow their tracker, so be wary!

What Factors Determine My MMR Gains and Losses?

A game’s worth is determined before the game even begins. Your team’s rating will be compared to your opponent’s rating and will calculate MMR values for a win and a loss. Once matchmaking is decided, there is nothing you can do during the course of the game to influence those values. If you win, you gain what the system determined the win was worth to you. If you lose, you lose what the system determined the loss was worth to you. Points don’t matter. Performance doesn’t matter. Your teammates abandoning you doesn’t matter. You abandoning your teammates doesn’t matter. Forfeiting doesn’t matter. Nothing you do during the game will influence those values. Your gains and losses are solely determined by whether you won or lost the game. Period.

How is the Game’s Worth Determined?

The amount of MMR that you win or lose after a game is determined by comparing your own MMR to that of your opponents. Things can get a little complicated from here because parties and rank disparities can actually impact this, but, for now, all that you need to know is that the system calculates a rating for each team and then compares those ratings to determine the odds. A team with a higher rating will be considered the favorite and a team with a lower rating will be considered the under-dog.
For a match that the system deems perfectly fair – each team is identically rated with a 50% chance of winning – each team will win or lose 9 MMR (more info would be needed to find the exact value, but 9 has always seemed to be the known value, or is very close to it). That would mean that each match is valued at 18 MMR.
At the most basic level, we can guarantee that 3 things are true:
- When matched against an equally rated opponent, you will gain or lose the average amount of MMR (e.g win 9, lose 9).
- When matched against a higher rated opponent, you will gain more or lose less than the average amount of MMR (e.g. win 11, lose 7).
- When matched against a lower rated opponent, you will gain less or lose more than the average amount of MMR (e.g. win 7, lose 11).
As you can see from the examples I provided, whatever MMR you gain in favor is removed from your loss. So, if you win a game and gain 13 MMR in the process, you gained 4 MMR higher than the average value. This means that a loss would have only cost you 5 MMR. This essentially means that the maximum MMR a player can gain from a win is 18 if sigma is normalized (which we’ll discuss next).
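The behavior described here, a fixed 18 MMR per match split by the pre-game odds, looks a lot like an Elo-style system. Below is a hedged sketch only: the real formula is not public, and the 400-point scale factor is purely an assumption for illustration.

```go
package main

import (
	"fmt"
	"math"
)

const matchWorth = 18.0 // total MMR at stake per match, per this guide
const scale = 400.0     // assumed Elo-like constant, not confirmed

// winLoss returns the MMR gained on a win and lost on a loss, based
// purely on the pre-game ratings of the two teams.
func winLoss(teamMMR, oppMMR float64) (win, loss float64) {
	// Modeled probability that this team loses the match.
	pLose := 1.0 / (1.0 + math.Pow(10, (teamMMR-oppMMR)/scale))
	win = matchWorth * pLose // underdogs gain more on a win...
	loss = matchWorth - win  // ...and lose less on a loss
	return
}

func main() {
	w, l := winLoss(1000, 1000)
	fmt.Printf("even match: +%.0f / -%.0f\n", w, l) // +9 / -9
}
```

Whatever the exact formula, the invariant from the text holds here: the win and loss values always sum to the match's total worth.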

Sigma: the Uncertainty Variable

Each individual playlist has its own matchmaking algorithm that determines your gains and losses (one for each competitive playlist and one shared for all casual playlists). In order for the system to determine how confident it is in your placement, it uses a sigma value to apply weight to the matchmaking algorithm and ensure that you get to your appropriate rank as soon as possible. To put it simply, the more games you play in a playlist, the more certain the system can be that you are ranked appropriately. This sigma value starts out high and is gradually reduced with each game played until it reaches its normal value somewhere in the range of 50-100 games played on a brand new account. So, every game played before that value is normalized will result in higher MMR gains and higher MMR losses. In other words, your rank will fluctuate more rapidly and appear a lot less stable until you've played enough games in a single playlist.
It’s also worth noting that the first 10 games that you play in each playlist on a brand new account are treated differently than any other 10 games you’ll ever play in that playlist. They are worth significantly more points and the matchmaking is unique. I’m not going to dive into this any further.
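A minimal sketch of how an uncertainty value could inflate gains and losses, assuming a simple linear model (the real multipliers are not public, and this guide's own examples suggest the true curve is not exactly linear):

```go
package main

import "fmt"

// normalSigma is the normalized value quoted later in this guide.
const normalSigma = 2.5

// adjustedDelta scales a base MMR gain/loss by the current sigma,
// purely to illustrate "higher sigma means bigger swings".
func adjustedDelta(baseDelta, sigma float64) float64 {
	return baseDelta * (sigma / normalSigma)
}

func main() {
	fmt.Printf("%.1f\n", adjustedDelta(9, 3.5)) // post-reset sigma inflates a 9-MMR game
	fmt.Printf("%.1f\n", adjustedDelta(9, 2.5)) // normalized sigma leaves it unchanged
}
```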

Is this Why My Teammate is Ranking Up Faster Than Me?

There exists a common scenario that goes something like this:
You party up with a friend for some ranked doubles: you are Silver 3 div 3 and they are Silver 2 div 2. You do really well and win a bunch of games, but at the end of your session you find that they are Gold 2 div 1 while you're stuck at Gold 1 div 1. Then you realize that your friend has only played a total of 20 games in that playlist while you've played 100. The sigma value was inflating the number of points they gained, so they passed you.
Something important to note is that the sigma value is different for brand new players versus what we experience during a reset. Brand new accounts have a much higher sigma value than returning players have.

How Do Season “Resets” Work?

There are 2 things that happen when a new season begins:
  1. If you ended up above the lowest possible MMR value for Champion 3 in any playlist, you are reset back down to that value. For Standard and Doubles playlists, this is commonly known to be 1380 MMR. Everybody else starts the new season exactly where they left off.
  2. Our sigma value for each playlist increases slightly: 0.5 to be exact. A normalized sigma value is 2.5, and the maximum value that it can be after a reset is 3.5 (the sigma value for a brand new account is much, much higher). This sigma value increases MMR gains and losses a slight amount and it lasts somewhere between 15 and 20 games. The first game you play may be worth 50% more (14.5 MMR instead of 9), the 10th game you play may be worth 25% more (11.25 MMR instead of 9), and the 20th game you play should be back to normal.
Simply put: unless you are rated in the top 1-3% of a playlist, your rank doesn’t change at all. The first 20 games you play, per playlist, at the start of a new season are worth more, but not by much. Your rank will only be significantly affected by the season reset if you go on a massive winning or losing streak during your placement matches. It may be worth noting that sigma value adjustments are actually influenced by the sigma value of your opponents, so while it may take between 15 and 20 games to normalize sigma at the start of a season, your sigma will likely normalize faster than that if you choose to wait until later in the season to play (if an opponent has a lower sigma value than you – meaning the system is more certain of their rank – then your sigma value will normalize at a faster rate).
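The two reset steps translate almost directly into code. This sketch uses the commonly cited Standard/Doubles values from above (1380 MMR cap, sigma +0.5 capped at 3.5):

```go
package main

import "fmt"

const resetCap = 1380.0   // commonly known Champion 3 floor for Standard/Doubles
const maxResetSigma = 3.5 // sigma ceiling after a reset (new accounts start higher)

// seasonReset applies the two documented steps: squash top MMRs down
// to the cap, and bump sigma by 0.5 up to the post-reset maximum.
func seasonReset(mmr, sigma float64) (float64, float64) {
	if mmr > resetCap {
		mmr = resetCap
	}
	sigma += 0.5
	if sigma > maxResetSigma {
		sigma = maxResetSigma
	}
	return mmr, sigma
}

func main() {
	mmr, sigma := seasonReset(1650, 2.5)
	fmt.Println(mmr, sigma) // prints 1380 3: only top players are squashed
}
```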

MMR in Parties (Ranked)

There is a lot of confusion on this subject, so I want to be very clear about this.

Matchmaking for parties works as follows:

If you are in a party where a player exists above 1140 MMR, your entire party is rated at the highest player’s rating.
A team consisting of an 1140, 1000, and 600 will be rated equally to a team consisting of 3 1140s.

If you are in a party where no player exists above 1140 MMR, your party’s rating is very heavily weighted towards the highest player in the party.
A team consisting of an 1139, 1000, and a 600 will be rated equally to a team consisting of 3 1068s.

If you are part of an all solo team, your team’s rating is a direct average of all players' MMR values.
A team consisting of an 1140, 1000, and 600 will be rated equally to a team consisting of 3 913s.

If a duo queue is partnered with a solo player, the duo queue is treated by party rules stated above, and then directly averaged with the solo player’s MMR.
A team consisting of a partied 1140 and 600 with a solo 1100 will be rated equally to a team consisting of 3 1120s.

What we do know right now is that the 1140 MMR threshold is very much present for the Doubles and Standard playlists, but there has been evidence to support the idea that this 1140 threshold may not apply to competitive extra modes, and that extra modes may always use a weighted average.
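The party rules above can be collected into one function. The 1140 threshold, the takeover rule and the solo direct average come straight from this post; the exact sub-threshold weighting is not public, so highWeight below is a guessed parameter (about 0.79 roughly reproduces the 1139/1000/600 to 1068 example):

```go
package main

import (
	"fmt"
	"sort"
)

const partyThreshold = 1140.0
const highWeight = 0.79 // assumption only; the real weighting is unknown

// teamRating computes a team's matchmaking rating from its members'
// MMRs, following the Doubles/Standard party rules described above.
func teamRating(mmrs []float64, partied bool) float64 {
	sorted := append([]float64(nil), mmrs...)
	sort.Sort(sort.Reverse(sort.Float64Slice(sorted)))
	highest := sorted[0]

	if !partied {
		// All-solo team: direct average of everyone's MMR.
		sum := 0.0
		for _, m := range mmrs {
			sum += m
		}
		return sum / float64(len(mmrs))
	}
	if highest >= partyThreshold {
		// Party at or above the threshold: rated at the highest player.
		return highest
	}
	// Below the threshold: heavily weighted toward the highest player.
	rest := 0.0
	for _, m := range sorted[1:] {
		rest += m
	}
	return highWeight*highest + (1-highWeight)*rest/float64(len(sorted)-1)
}

func main() {
	fmt.Println(teamRating([]float64{1140, 1000, 600}, true))  // 1140: takeover rule
	fmt.Println(teamRating([]float64{1140, 1000, 600}, false)) // solo average, about 913
}
```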

MMR in Parties (Casual)

Matchmaking in casual play is always going to be weighted without the hard limit that ranked matchmaking experiences, and it is weighted much more leniently than its ranked counterpart.

The Catch-Up Mechanic

The support site states that a catch-up mechanic exists for lower-skilled party members when "a party member is at least one skill rank below the highest ranked player in the party". Unfortunately, we don't really know what's true here because it hasn't been confirmed nor specifically tested (that I know of). It's been very obviously observed to be part of the system, but I don't know enough intricate details to get into it much further.

Be Wary of What You Read on the Support Site

I'm not claiming to know more about the inner workings of the game than Psyonix does. That would be silly. But, either way, if you happen to find some conflicting information on the Psyonix support site, I would be cautious about simply assuming that it's true, specifically with regards to matchmaking. While I would love to trust what it says (and it seems to be generally correct) their site has a history of being outdated, over-simplified, or just outright incorrect.
For example:
Special shout out to u/HoraryHellfire2 for all of the related work that he’s done on this specific subject along the way, and for keeping me in the loop on any changes he discovers.

Change Log

submitted by ytzi13 to RocketLeagueYtzi [link] [comments]

2020.08.31 17:59 CantFlyRL Dear Psyonix: A couple suggestions/requests that will greatly improve the QoL of developers of community tools around Rocket League

Hey there !
If you're from Psyonix and are reading this, thank you in advance for your time !

A little introduction

I'm Can't Fly, the developer behind
I'd like to start by saying: Rocket League is the bestest most awesomest game ever !
This is the game that brought me back to gaming after many years of hiatus.
It was love at first sight when a friend showed me the game, around 2017.
I've since put 3700 hours in the game, and it would have been much more if not for me launching
The scope of the project was initially very modest: just a simple whole-field viewer of a Rocket League game, to prove to my friend that we kept losing because he sucked.
I've spent a couple months getting my head around the replay file format, taking inspiration from some of the existing parsers, mostly jjbot's C# parser.
Then I reached my set goal: completely parse a replay file, extract the players and car positions, and render the game in a webpage using three.js, with a barebones 3D representation.
I thought that this could be of interest to the community, so I posted a video preview of the 3D viewer here in reddit, and to my surprise, the post just exploded.
The community's excitement, praise, encouragement and suggestions have fueled me to go much farther than what I initially planned to do, with the site main focus changing from the 3D viewer to stats extraction.
Since then, the site has grown beyond my wildest dreams:
Whew, that was a lengthy introduction.
Back to the meat of the subject: a couple of suggestions that would greatly improve the quality of life of developers providing tools around Rocket League for the community.

1. Add timezone to replay date

I'm referring to the ReplayDate field in the replay file header.

2. Add the season in the replay file

Either in the header, or in the network stream, maybe in the TAGame.GRI_TA class ?
Just to give you an idea of how complex it is today to determine a given replay's season, here's the code I currently use in the codebase:
    func For(gameVersion int, licenseeVersion int, date time.Time) int {
        switch {
        case licenseeVersion == 29:
            return 14
        case licenseeVersion == 28:
            switch {
            case date.Before(time.Date(2020, 3, 25, 18, 0, 0, 0, time.UTC)):
                return 13
            default:
                return 14
            }
        case gameVersion == 0:
            switch {
            case date.Before(time.Date(2016, 2, 10, 1, 0, 0, 0, time.UTC)):
                return 1
            case date.Before(time.Date(2016, 6, 21, 2, 0, 0, 0, time.UTC)):
                return 2
            default:
                return 0
            }
        case gameVersion >= 30:
            switch {
            case date.Before(time.Date(2019, 8, 27, 17, 0, 0, 0, time.UTC)):
                return 11
            default:
                return 12
            }
        case gameVersion >= 29:
            return 11
        case gameVersion >= 27:
            switch {
            case date.Before(time.Date(2019, 5, 13, 1, 0, 0, 0, time.UTC)):
                return 10
            default:
                return 11
            }
        case gameVersion >= 25:
            return 9
        case gameVersion >= 24:
            return 8
        case gameVersion >= 22:
            return 7
        case gameVersion >= 21:
            return 6
        case gameVersion >= 20:
            return 5
        case gameVersion >= 18:
            return 4
        case gameVersion >= 12:
            return 3
        case gameVersion >= 10:
            return 10
        default:
            return 0
        }
    }
As you can see, this is very fragile, and downright wrong, due to the fact that replay dates do not contain the timezone.

3. Store player ranks in replays

This would be huge!
This would solve 87.354% of the use cases of an official API, I think.
Also, it would spare people like me from resorting to ... let's say not-very-savory ways to get this info from other unofficial sources.

4. This one might be trickier: access to pre-builds

Would you be open to setting up a system, say a vetted community with access to pre-builds of the game before they go live?
The goal is to leave us time to prepare our parsers for changes in the replay file format.
Currently, whenever a new version is released with a change in the replay file format, all the community tools (the open-source parsers, the various mods, etc.) become broken.
Depending on the maintainer's timezone, this can happen at an inconvenient time. Also, it can take a little while to figure out the change and fix the parser accordingly.
Meanwhile, replays fail to parse and keep queuing up on the server, depriving the community of tools they rely on (many tournaments use the API for their bookkeeping).
Once the fix is pushed, it would still take the server a while to reprocess the thousands of replays that were uploaded in the meantime.

That is all!

If you've read this far, thank you again for your time, and hopefully this post falls on receptive ears!
Also, thank you again for making this game, for keeping it alive this long, and for the time & effort you're putting into interacting with and listening to the community ❤️
RocketLeagueRocks!
submitted by CantFlyRL to RocketLeague [link] [comments]

2020.08.24 14:58 Michair-Vn Michair Company, your best choice

Michair Company, your best choice

Sale shock - Discount $30/kg

Save the date: August 26th, 4:00 PM – 5:30 PM (Vietnam time zone), to buy the hair you are looking for at an unprecedented price.
  • Follow us on Facebook and Instagram to make sure you don't miss our special livestream.
  • A $30/kg discount for the first 5 customers buying via livestream.
  • All customers who purchase hair via livestream can receive gifts from Michair, such as beautiful eyelashes, massage combs, and many more lovely gifts...
  • The discount applies to a maximum of 5 kg of hair. If your order is over 5 kg, the remaining weight will be charged at the normal price.
  • If you register to buy hair during the livestream but do not pay within 48 hours, this offer will not apply.
  • Step 1: Like and share the livestream in public mode
  • Step 2:
Like the fanpage MICHAIR.VN:
And follow us on Instagram: @michair_company_vn
  • Step 3:
- Comment in the format "I want to order + WhatsApp number" if you watch the livestream on Facebook
- DM in the format "I want to order + WhatsApp number" if you watch the livestream on Instagram
Don't miss this unprecedented opportunity; it's available only once, during the livestream. Look forward to getting the BEST DEAL.
For more information, please contact us via WhatsApp: +84962279910
Thank you for reading!
submitted by Michair-Vn to u/Michair-Vn [link] [comments]

2020.08.24 01:45 xeznok The State of Packet Inspection Tools

I've put together this post as mainly a brain dump of information I've gathered on Albion packet inspection (or "sniffing") and the tools that are allowed to do it. My goal is to get the community aligned in knowing what's possible, what's permitted, and why SBI's communication on this matter has been frustrating for many developers. I don't expect a lot of people to read the whole thing, but I feel it's important to have all this information in one place. I aim to be as objective as possible, though some bias may sneak through (and is generally noted) as I have personally been slammed by SBI's ambiguous policies. Please call me out on anything that seems overly biased and I will update accordingly.

Who Am I?

I'm Reznok, a co-author of the recently released ROAM (Roads of Avalon Map) tool. I have direct connections and relationships with Savage, 3mpire, and NEWBY leadership, and am the current guild leader of "Join Baka", though I try to avoid in-game politics as much as possible. I consider myself one of the most knowledgeable people when it comes to tool development based on packet inspection. I have written multiple private/limited-release tools such as loot loggers, chest value estimators (RIP), sovereignty map updaters, fame-per-hour calculators, detailed player inspectors (showing death fame, equipped gear value, guild histories, etc. when inspecting manually), and map navigation tools. I have also recently publicly released the ROAM tool (along with theblackavenger), which included a client application to detect ROA portal connections.

What Is "Packet Inspection"?

Packet Inspection / Packet Sniffing is the process of automatically monitoring the game traffic between Albion's servers and a game client. This can be done with generic tools such as Wireshark/Winpcap or tcpdump or using hardware solutions such as network taps or firewalls with logging. For almost every action in the game, there is a communication between the game client and the game server. Typically, you will see a flow such as:
  1. Client requests to do something (move the player, check the market, send a chat message, etc.)
  2. Server validates that the client can do this thing and executes the action on the server
  3. Server sends the results of this action to all affected clients (player x is now at y position, the market results are a/b/c, you receive a new message, etc.)
All of these actions are data sent across the internet in packets. These actions, in SBI's implementation, are all conducted in plain text with no encryption or intentional obfuscation. This means anyone able to read these packets can see exactly what data is being passed back and forth between their client and the server. This is done completely external to the game client itself and does not require reading any in-game memory or process state.
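To make the flow above concrete, here is a minimal, purely illustrative sketch of what a passive observer works with. The payloads and opcodes are invented (the real traffic is a binary protocol, not JSON); the point is only that unencrypted packets map directly to game events:

```python
import json

# Invented plain-text payloads standing in for captured packets.
# Real Albion traffic is binary, but equally unencrypted.
captured = [
    b'{"op": "move_request", "x": 10, "y": 4}',      # 1. client -> server
    b'{"op": "move_ack", "player": "px", "x": 10}',  # 2-3. server validates, broadcasts
    b'{"op": "chat", "from": "bob", "text": "hi"}',  # 3. server -> affected clients
]

def dissect(payload: bytes) -> dict:
    """Decode one captured payload into a structured game event."""
    return json.loads(payload)

events = [dissect(p) for p in captured]
print([e["op"] for e in events])  # ['move_request', 'move_ack', 'chat']
```

Nothing here touches game memory or modifies traffic; the observer only reads bytes already crossing the wire.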

What is SBI's Stance on Packet Inspection?

This is the most frustrating question to try to answer. The short version is that packet inspection is allowed, except for when it's not. To start, I'll begin with the TOS, Section 13.3:
Users may only play the Game personally. They are not permitted to manipulate the Game or use technical tools that give Users an advantage over other players. In particular Users are forbidden to ... use software enabling “data mining” or which intercepts or captures data otherwise in connection with the Website and the Game.
This should make it open and shut right here: packet inspection is forbidden, and anyone doing it is directly breaking the TOS. However, we then get to dev responses on the forums:
There's a lot to unpack here between MadDave and Mytherceria's posts. Some key takeaways though:
  1. Packet Inspection, as long as it is limited to inspection and not modification, is fine and completely undetectable. It's impossible to detect, and the data is not intended to be secret from the client, so MadDave originally came out saying build whatever you want based on it.
  2. In February 2020, Mytherceria put some bounds on this. Anything that gives a direct benefit in PvE, gathering, or PvP, or that somehow modifies the game/data or provides an in-game overlay, is prohibited. No definition is provided for "direct benefits". It's also stated in her post that there are cases of "Clearly Cheating" and "Clearly Okay", which is an extremely subjective thing to say in an official post. She suggests emailing Albion support for future clarification, but as anyone who has dealt with support in the past can attest, this is an arduous and slow process.
  3. It's not the Packet Inspection that's restricted, it's what is done with the data (which at that point is 100% completely out of the game and SBI's control), that is restricted.
So, in summary: SBI has admitted to being unable to detect packet inspection tools, but will ban you if they detect the use of a tool that falls within their subjective definition of cheating.
In an attempt to clear this up with SBI previously, I have already made the following request to Albion Support, which was ignored:
  1. Update the Terms and Conditions Section 13.3 to explicitly define what data is permitted to be captured. or
  2. Enforce the prohibition of all packet capturing tools.

What Tools Exist That Utilize Packet Inspection?

I'll be using this section to go over some approved and rejected tools.

Approved Tools

Albion Data Project

The largest, most well-known, and most-used project has to be the Albion Data Project. This tool gathers mostly auction-house-specific data and uploads it to a central server. The description from their site:
The goal of this project is to collect and distribute realtime information for Albion Online. This is achieved with a downloadable client that monitors network traffic specifically for Albion Online, identifies the relevant information, and then ships it off to a central server which distributes the information to anyone who wants it.
Broderick has truly gone above and beyond with the tools he has developed for the Albion Data Project. He has actively maintained the market data servers and many tools developed to assist in the gathering of Albion data (including some of the best raw data-mining tools out there).
This tool provides publicly available APIs for market data on any item in the game at any location in the game, all sourced from running the data collection client. An example of what this data looks like: a price query for T4_BAG,T5_BAG with locations=Caerleon,Bridgewatch&qualities=2.
Many, many, many tools have been written that utilize this data collection. I have personally witnessed and written tools where you punch in two cities (Fort Sterling and Bridgewatch, for example), and using this data they will dump out a shopping list of what to buy in city A and sell in city B, as well as your expected profit margins (typically in the tens of millions for each transport).
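A sketch of that shopping-list idea, using invented prices shaped loosely like the price API's output (the field names here are assumptions for illustration, not the API's exact schema):

```python
# Invented price rows in roughly the shape of the market-data API.
prices = [
    {"item_id": "T4_BAG", "city": "Fort Sterling", "sell_price_min": 2000},
    {"item_id": "T4_BAG", "city": "Bridgewatch", "sell_price_min": 3500},
    {"item_id": "T5_BAG", "city": "Fort Sterling", "sell_price_min": 9000},
    {"item_id": "T5_BAG", "city": "Bridgewatch", "sell_price_min": 8000},
]

def shopping_list(data, buy_city, sell_city):
    """Return (item, margin) pairs worth buying in buy_city and selling in sell_city."""
    by_item = {}
    for row in data:
        by_item.setdefault(row["item_id"], {})[row["city"]] = row["sell_price_min"]
    flips = []
    for item, cities in by_item.items():
        if buy_city in cities and sell_city in cities:
            margin = cities[sell_city] - cities[buy_city]
            if margin > 0:  # keep only profitable flips
                flips.append((item, margin))
    return sorted(flips, key=lambda f: -f[1])

print(shopping_list(prices, "Fort Sterling", "Bridgewatch"))
# [('T4_BAG', 1500)]
```

The real tools add transport weight, market taxes, and historical volume on top of this, but the core is just a per-item price difference between two cities.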
In my opinion, and many others', this gives an insane PvP advantage to any players utilizing these tools, as the best ones are not publicly available and people are gaining millions and millions of silver each day using them.

Albion Online Stats

This is a handy DPS meter and fame-per-hour tracker. It does exactly what you would expect of a DPS meter and is all-around an awesome project.

Rejected Tools


Radar

An Albion in-game overlay radar that shows you players and resources that are just outside your view range but for which you're still receiving network traffic. I won't provide a link for this one, as it was rejected; even by SBI's subjective definitions, it very obviously falls into the cheating category. However, it's worth noting that this project exists and that there have been multiple highly-featured spin-offs of it.

Roads of Avalon Mapping Tools

Recently (as of 23-08-2020), SBI has come out saying that automatic packet inspection of anything related to Roads of Avalon portal data is explicitly forbidden. This destroyed many such tools in use by ROAM, Mango (Newby's tool), CIR, and many others, I'm sure. The reasoning for the ban is SBI's stance that ROA maps provide an unfair PvP advantage over other players. There was also a broad, sweeping statement saying that generating maps/routes from any portal data is prohibited, but as mentioned before, once you have the data (whether from packet inspection or manual means), it is completely out of SBI's control.
In my, admittedly biased, personal opinion: The advantages gained from tools like this are minuscule when compared to the advantages provided by the Albion Data Project. I am of the belief that SBI's current extreme reaction is due to not expecting publicly available fully-featured mapping tools this soon. However, anyone who has played Eve saw tools like this coming the second ROA was announced.

What's Next for Packet Inspection?

I'm hoping that by this point if you somehow managed to read this whole thing, you'll understand the frustration that developers find themselves in. We're forced to abide by ambiguous and arbitrary rules that, as written, make no sense. With this in mind, I again make my same request to SBI:
  1. Update the Terms and Conditions Section 13.3 to explicitly define what data is permitted to be captured and used or
  2. Enforce the prohibition of all packet capturing tools.
SBI, I would love to have a discussion with you guys about the capabilities of packet inspection tools and help work with you to clearly define the bounds of permitted packet inspection. You can reach me on discord at Reznok#0001, or by e-mail (you have it from my submitted support requests).
To the community: this post is only the beginning for me. I plan on releasing boilerplate projects and tutorials to start building your own packet inspection tools. These will show developers how to monitor game events and deal with the data, so that anyone can start easily developing their own tools without having to tear through the approved projects' source code like I had to.
submitted by xeznok to albiononline [link] [comments]

2020.08.19 02:30 rileysbonesaw Looking for an approach regarding pattern matching using music meta data

Hey folks,
I would like to ask for a hint nudging me in the right direction regarding how to compare a set of values with others for similarity.
The concrete goal is to use my existing tags for playlists on the rest of my existing, non-tagged music. I know I can just search a song and it will tell me it's a Rap/Metal/Indie/Pop song but having lots of cross-genre playlists, I'd love to use my existing tags to help me when tagging the next song.
Lots of songs are possible to find on Spotify, lots are searchable on other databases. I am currently experimenting with the Spotify API and I will probably use MusicBrainz as well in the future.
What I have so far, with a bit of detail but not too much: I wrote a shell script that lets me search a track on Spotify via artist and track name, and then pipes the resulting track and artist URIs into further Spotify API calls to get some info on the track and the artist in general.
Artist genre:
Track analysis:
This is the current output
    ❯ pls spotify search "Nine Inch Nails" "head hole"
    spotify:artist:0X380XXQSNBYuleKzav5UO
    Nine Inch Nails
    [
      "alternative metal",
      "alternative rock",
      "cyberpunk",
      "electronic rock",
      "industrial",
      "industrial metal",
      "industrial rock",
      "nu metal",
      "rock"
    ]
    spotify:track:3ckd4YA4LcD3j50rfIVwUe
    Head Like A Hole
    {
      "danceability": 0.663,
      "energy": 0.792,
      "key": 9,
      "loudness": -11.255,
      "mode": 1,
      "speechiness": 0.0456,
      "acousticness": 0.00787,
      "instrumentalness": 0.00183,
      "liveness": 0.582,
      "valence": 0.443,
      "tempo": 115.386,
      "duration_ms": 299640,
      "time_signature": 4
    }
What I would do next is store this information alongside an array of the playlists each tagged song is already in. Using this stored data, I would be able to search for a non-tagged song, get the metadata, and calculate a similarity/relevance score for the track with regard to each playlist.
I am, however, not sure how to approach this. Pattern matching makes me think of machine learning. Then again, I have a feeling I could probably solve this in plain old Excel. Or maybe spin up an ElasticSearch somewhere and perform searches against that? And in the end, there's probably a jQuery function exactly for this already.
I would be very happy to hear about some experience with such pattern matching, because I have the feeling I could very easily run down an overly complicated path while not knowing about a rather simple approach.
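For what it's worth, the "plain old Excel" intuition goes a long way before machine learning is needed: treat each playlist as the average of its members' audio features and rank playlists by cosine similarity to the new track. A sketch using the feature fields from the output above (the playlist contents are invented):

```python
import math

# Subset of the Spotify audio-feature fields shown in the output above.
FEATURES = ["danceability", "energy", "speechiness", "acousticness", "valence"]

def centroid(tracks):
    """Average feature vector of the tracks already in a playlist."""
    return [sum(t[f] for t in tracks) / len(tracks) for f in FEATURES]

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Invented playlists, each seeded with one already-tagged track.
playlists = {
    "workout": [{"danceability": 0.7, "energy": 0.9, "speechiness": 0.05,
                 "acousticness": 0.01, "valence": 0.5}],
    "chill":   [{"danceability": 0.4, "energy": 0.2, "speechiness": 0.04,
                 "acousticness": 0.8, "valence": 0.3}],
}

# Head Like A Hole, from the API output above.
new_track = {"danceability": 0.663, "energy": 0.792, "speechiness": 0.0456,
             "acousticness": 0.00787, "valence": 0.443}

vec = [new_track[f] for f in FEATURES]
scores = {name: cosine(vec, centroid(ts)) for name, ts in playlists.items()}
best = max(scores, key=scores.get)
print(best)  # workout
```

Fields on very different scales (tempo, loudness, duration_ms) are left out here; they would need normalizing first or they would dominate the distance.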
EDIT: Broken formatting
submitted by rileysbonesaw to learnprogramming [link] [comments]

2020.08.18 19:19 codingcorgi How I landed my first SWE job without CS degree & bootcamp

Long time lurker, first time posting (using a new account)
Some background: I do have a business degree from a state school. My first job out of college was in the digital media industry. I have some friends in the tech industry I consulted before making this decision.
6 months ago, I quit my job as a video producer/editor to pursue software engineering. I had already been studying after work hours and on weekends for about 4 months. I considered attending a bootcamp, but by the time I left my job, I felt I was already far enough along that I wouldn't be getting the most out of it. Not to mention the high cost and bad timing (quarantine), so I decided to keep going with self-learning.
I treated programming like my day job, studying and building projects from 9am - 8pm, and sometimes even on weekends. I used online resources such as freecodecamp, frontend masters, and scrimba to learn this stack:
HTML/CSS/SCSS, JavaScript, React, Redux, Typescript, Node, Express, MongoDB, GraphQL, etc.
I loved learning new technologies, but I loved building projects even more. That's when I was really able to learn and be comfortable with programming languages and frameworks. I found myself really immersed in the world of software engineering, exploring my creativity through coding.
One of my favorite projects I built was a dog breed quiz. The application fetched ten random images from a dog images API and generated multiple-choice answers for players to choose from. Points were calculated from the number of correct answers and the time it took to complete the quiz. A scoreboard let players see where their dog-breed knowledge stood.
My most recent project was one I was really proud of. I built an e-commerce site for my dog. I found this project really challenging because I took on learning GraphQL, TypeScript, SCSS, and Shopify's API.
I showcased all 12 of my projects on my website portfolio. I knew that since I never had formal education in computer science, I needed to stand out somehow.
The job hunt started in late June. I was pretty discouraged since it didn't seem like there were many junior positions (compared to countless senior opportunities) due to Covid-19. I totally get it; onboarding and training junior programmers is tough!
65 applications, 3 interviews later, I got a job as a front end software engineer! The interview process consisted of a phone screen, React technical interview, and a cultural interview with the CTO.
This journey has been stressful, yet fulfilling, and I can't wait to start my career in the tech industry. I've read a lot of posts here on this subreddit, and it has made me feel less alone and I thank you all for that :)
*EDIT: I previously wrote 5 interviews, but actually had 3. At one company I passed all three rounds and counted each round into the 5. My apologies.
submitted by codingcorgi to cscareerquestions [link] [comments]

2020.08.15 20:41 m_ologin Just finished my very first game using Godot! Meet LittleBigFactory, a 64x64 factorio tribute!

Just finished my very first game using Godot! Meet LittleBigFactory, a 64x64 factorio tribute!
You can try out the game here:
A play session takes roughly 1 hour. You can learn how to play within the game.
Code, art, music and sound are all open source (MIT) if you are curious on how to make a game like Factorio.

A small piece of my CPU factory
This was my first time using the Godot Engine to make a game. In the past I've played mostly with Unity, Pico8, Love and MonoGame.
What I loved about Godot:
  • GDScript was surprisingly pleasant, although I really missed strong typing
  • Having offline instant access to API docs is AMAZING, reminiscent of the old QBasic days
  • The ease of making something pixel-perfect compared to Unity was refreshing
  • The core engine philosophies around object composition. Not necessarily new for gamedev but really well thought-out
  • The fast dev cycle due to the ability to debug single scenes really fast
What was challenging:
  • Making a management game with heavy reliance on complex UI in 64x64 pixels required some thinking
  • FPS starts dropping to the low 10s when you hit ~20K in-game moving objects, which can make the end-game challenging once the factory gets complex. With more time, I would get rid of the collision detection altogether. This is what they do in Factorio, where objects simply follow each other rather than calculating their own trajectories and bounds.
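That follow-each-other approach can be sketched in a few lines (Python here for brevity; the same logic ports directly to GDScript, and the numbers are illustrative). Belt items are just scalar offsets along a fixed path, advanced each tick and clamped against the item ahead, with no collision checks at all:

```python
ITEM_GAP = 1.0  # minimum spacing between items on the belt
SPEED = 0.5     # distance each item advances per tick

def tick(positions):
    """Advance belt items one step; positions[0] is the frontmost item."""
    out = []
    for i, pos in enumerate(positions):
        target = pos + SPEED
        if i > 0:  # never move closer than ITEM_GAP to the item ahead
            target = min(target, out[i - 1] - ITEM_GAP)
        out.append(target)
    return out

belt = [5.0, 3.75, 2.0]  # second item is closing in on the first
print(tick(belt))  # [5.5, 4.25, 2.5]
```

Each item does O(1) work per tick against a single neighbor, instead of a physics query against the whole scene, which is what makes tens of thousands of moving objects feasible.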
Had a blast making this during my evening hours, will definitely reuse the engine for other hobby projects!
submitted by m_ologin to godot [link] [comments]

2020.08.14 17:15 SaraCaterina Thank you. Thank you thank you thank you thank you THANK YOU.

Update 2: well this is the last time I try to be positive in here. Thanks for all the awards, and thank you to those who understood where I went with this post. I'll go back to posting screenshots and the rest can go back to endless complaining.
Update: some of y'all just don't get it. You mistake this post for me praising CIG and blindly ignoring the issues with the game. I'm not. I'm simply thanking people (which btw, I'm thanking more than just CIG if you bothered to read the rest of the post) for what they've done and for what we have now. There's nothing wrong with a little hope and positivity, and those telling me I'm the problem have obviously never looked in the mirror.
You know what there isn't enough of in this game? Appreciation. I get that things maybe aren't where we want them to be, but sometimes to appreciate this game we need to take a look at what the game used to be, how far it has come along, and the future plans we can look forward to. With that said, there are a lot of people to thank that all contribute to this awesome game and the community surrounding it.
Thank you, CIG, for making this wonderful game possible.
I bet you guys are under a lot of pressure every day, dealing with people complaining about how long this ship is taking to finish, that bug is taking to be fixed, and that feature is taking to be implemented. I am aware that core tech is being built from the ground up and that those things take time, and sure, you could always pull APIs out of a hat like other games and get this project done faster, but the fact that you don't contributes to a level of uniqueness that no other game can match. Sure, communication regarding progress could be better. Sure, certain things are taking longer than expected, but you know what? Things happen, and it's proof that you guys are human like the rest of us. Keep up the good work, you guys are great people, and I look forward to more cool stuff.
Thank you, Jared, for managing to get in front of a camera every week and helping make ISC videos possible.
People complain about the lack of content in ISC videos, or how it could be better, or how the people who leak content in game files provide the "true Inside Star Citizen" but the fact is, nobody can do it better than you. I get that some weeks there just isn't enough to say, or that there hasn't been anything TOO new to announce or talk about, and I appreciate that you're still willing to faithfully help make these videos a reality. Thank you also to those people willing to film in their own homes, seeing all the devs and designers in PJs talk about this awesome game just adds to my point earlier about how you guys are just regular people like us, working hard to make Star Citizen the game it is now, and what you guys want it to be.
Thank you, CIG Player Relations, Concierge Support, and whoever else helps take care of our support tickets.
Y'all put up with our crazy questions, endless hey-I'm-a-concierge-member-can-I-have-this requests, dealing with all the refunds because X ship apparently isn't flyable yet and the loaner they're getting is less than desirable, people complaining about how this or that is broken but instead of reporting it to the Issue Council they're somehow telling Player Relations, and yeah. It's a lot of work, and a lot of people to help, so thank you. I'd personally like to thank ItsDatBro-CIG for helping me get my 890 Jump back.
Thank you, Evocati, for testing our builds and playing with bugs so we don't have to.
If you're an Avocado, it's most likely because you too, like most of us, have a passion for Star Citizen and you too want to help make this game a reality. You guys deal with all the broken updates and help hammer away at the issues most of us would die having to put up with. You guys are also under pressure to leak information to impatient players, and probably to "hurry up with these bugs so we can release X.XX to all the people saying GIB UPDATE," so for that, thank you. Keep up the great work.
Thank you, Concierge members for dropping thousands into this game to help support development.
I know, I know, there's a lot of controversy surrounding JPEG purchases, but a lot of people don't understand what pledging is. I can't emphasize enough that we are NOT. PURCHASING. SHIPS. We pledge to help development and in return, we get early and permanent access to a ship. I mean, they tell you everywhere on the site and make you agree to this statement before you pay for anything. If you're aware of PBS, they have a similar pledge system. You pledge a couple hundred dollars, and you get a DVD set or something. It doesn't mean that DVD set costs $300; it's just what you get for helping support them, which in the end is the ultimate goal for spending money this early into Star Citizen. Heck, even those who only spend $45 are contributing. Every penny counts, and remember, this game is crowdfunded, so WE are the ones who help fund it.
Thank you to those who take the time to submit bug reports.
Some of us run into a bug and we're like, "Yep it's broken. Time to log off and play something else. Let someone else deal with this." But there are those who take the time to write detailed bug reports and post them to the Issue Council. Thank you, because you guys are the ones that help get things fixed.
Thank you, Reddit/Spectrum mods for giving us a medium to express ourselves.
You guys also deal with a lot, especially Spectrum mods. I can only imagine the moderation manpower it takes to put up with a community that doesn't understand how things work and can only complain. Personally, I love this subreddit and it's the reason I even have a Reddit account.
Thank you, pirates, for making things interesting and giving us something to shoot.
No, I'm not talking about annoying pad rammers, or those who go out of their way to give people a hard time. I'm talking about legitimate pirates who stay true to a code and who love combat, but are also good people. I met two pirates who shot at me during a trade run. I escaped, and they complimented me on my evasive maneuvers and quick thinking. They're now in my Discord server offering to run escort and protect me from other pirates. Piracy isn't a bad thing in this game, and sure, most of the time it's more griefing than piracy (since you don't really get much out of it other than the other ship blowing up or the other person dying), but hey, it adds to the gameplay and is the reason my org has a security division. You guys also use all the broken guns and help CIG determine what needs to be balanced out in terms of general combat. So, thank you.
Thank you, starcitizen_refunds, for giving me something funny to read from time to time.
Honestly, the more of you guys are there, the less of it is here. LOL.
Thank you to the team in charge of bringing the 890 to life.
It's my favorite ship and I love it and I love y'all. The amount of detail you put into such a large object is astounding, and every time I spawn it I'm reminded of all the work put into it. And this goes for any ship really, but the 890 is my personal favorite.
Last but not least, thank you Chris.
You get memed and made fun of a lot but you helped make my dream of flying in space with friends inside of large ships while we explore planets and face the unknown, all while enjoying wonderful sounds and great visuals... possible. Star Citizen is a game like no other; credit will be given where credit is due and you definitely deserve the credit for that.
Okay I know I said that was the last thank you but I have ONE more to thank. The amazing Star Citizen community. YOU.
Despite all the bugs, crashes, deaths, 30Ks, annoying people in global chat, 30Ks, and delays, you guys come back to the game because what you get out of it and the fun you have with friends is worth it and eventually drowns out all the issues you might run into. You guys are willing to help new players feel at home and get situated with the game. You guys are the ones who help create awesome tools like #DPSCalculator and FleetView for us to use. You guys are the ones making cool machinimas and cinematic videos. You guys are the ones who take amazing screenshots to show off just how beautiful this game is. The community consists of some of the coolest people in the world, and they deserve to be noticed.
Remember when all we had was a hangar module? Remember when the only missions we could do were comm arrays? Remember when we didn't have large ships? Or when planetary landing was only a dream? Remember when the only fun thing we could do in Star Citizen was meet up at Cry-Astro? If you remember how it used to be, then you'll understand how far the game has come. If you stuck with the game that long and appreciate SC for what it is now, you guys are awesome.
Thank you.
submitted by SaraCaterina to starcitizen [link] [comments]

2020.08.14 11:46 alexanderolssen Five tools to build your startup MVP without code.

I think almost everyone in tech loves inspirational stories about startups created by a few geeks in a garage or a rented apartment. They motivate us to start our own projects, but every idea runs into the implementation stage, which can be a really tricky part for non-tech founders.
Being a non-tech guy myself, I’ve tried to find a way to build my ideas without code. And I found it! This way is called «no-code development», «visual development», or just «no/zero-code». In simple words, it’s a way to create digital products without writing code (or with minimal code involved) using a platform that lets you develop functional prototypes (or MVPs) by combining different blocks.
In this article, I will talk about the platforms that will help you build your idea by yourself, without having to learn to code, finding a co-founder, or hiring a developer.


Despite its ambiguous name, the service is interesting because it allows you to create websites not only with pictures and text but with filters and maps, using only Google Sheets!
The service has many templates with which you can quickly create the simplest online store, voting site, or collection-based website.
But it’s better to see once than to hear a hundred times, so take a look here at «live» projects made with sheet2site.
The service also has alternatives— table2site and


A feature-rich and relatively easy-to-learn platform for creating websites, online stores, blogs, etc., which has earned users' love for its design capabilities, convenient visual editor, and the ease of building and launching websites.
But regular websites and online stores are just the tip of the iceberg.
Webflow has a lot of integrations and the ability to add custom code, which allows you to expand the functionality and create prototypes not only of simple sites with collections, but also more complex projects, such as delivery services, online learning platforms, and even marketplaces.
Here are some integrations that will help you to build more complex websites:
Here are some cool websites made with Webflow, using the integrations mentioned above: Channels Stack, Makerpad, Goodland, Failory.


In my opinion, the most powerful web application development platform on the market right now. It has not only a visual editor but also tools for creating a database and logic (backend), and even a feature for working with third-party APIs.
You can easily receive and display data from other services, authorize users via Facebook / Twitter / Google, send data to other services, and much more.
Bubble allows you to create very complex applications with the interaction between several users, such as chats, forums, booking applications, task trackers, marketplaces, CRM, and even dashboards. The list is almost endless.
This tool has quite a steep learning curve, but just take a look at the real projects made with Bubble: NotRealTwitter, Nucode, Vestn, Topshape, Hackerhouse.Paris


A platform for building mobile and web apps that can be published to the App Store, Google Play, or as a Progressive Web App.
With Adalo, you can create attractive and, most importantly, functional applications that can include APIs, payments, push notifications, a database, charts, user authorization, and other cool features, not to mention integration with Zapier, which further expands the platform's functionality.
Adalo is suitable for creating a marketplace, a social network, a calculator for something, or a booking app. You can even wire multiple applications together, which is especially useful for applications with a few different user roles, such as seller-buyer or customer-business.
Here are some apps made with Adalo: Primus Fitness, Memolly-subscription manager, Invocial, Support Upstate SC, Cropify.
Adalo isn’t the only platform for building mobile apps. There are several similar app builders on the market, for example, Glide, Thunkable, or Kodika.


A well-known app that allows you to create various workspaces and add blocks to them, such as text, pictures, links, tables, to-do lists, and some others.
Notion is incredibly simple but at the same time functional enough to be used as a prototyping tool for testing simple ideas.
Let’s take a quick look at some Notion features. The service has links that can be attached, for example, to an Amazon product; comments that can be used for user communication; and public access to pages, so you can share a page over the internet. It’s possible to create nested pages, add video and audio, and embed various services. And, as the cherry on top, you can have your own domain name with the help of Host Notion or Super to get a personal URL.
With just a bit of imagination, Notion can be a suitable tool for testing a hypothesis.
There aren’t a lot of project examples built with Notion, but you can check out the Toolskit platform, which contains educational materials on a variety of topics, and Bookcelerator, now a book collection site that was originally a simple Notion page.

We’re living in a great time when anybody can build something without paying huge amounts of money to agencies, hiring a developer, or spending years learning how to code. No-code is definitely a trend that should spread widely but be used wisely. Not everything can be (or should be) built using no/low-code platforms. If you need something reliable, scalable, innovative, secure, or complex enough, the traditional coded approach may be better.
submitted by alexanderolssen to Entrepreneur [link] [comments]

2020.08.13 10:00 DragonSlayer314159 The Case for Vanguard FTSE All-World UCITS ETF. Finding a blend between US and International Stocks. (Google Sheets Portfolio simulator included)

Some details about this particular ETF I'm going to write about:
This fund was launched on 23 July 2019 and its size already tops 1,018 mil. Euros. To put this in perspective, the distributing version of this fund, ISIN: IE00B3RBWM25, was launched on 22 May 2012, and its share class assets are valued at just 4,253 mil. Euros.
This clearly demonstrates that investors really liked the idea of an All-World accumulating fund. Vanguard finally launched it seven years after the distributing one, but it’s already gaining momentum.
The most popular UCITS ETF for EU investors is still iShares Core S&P 500 UCITS ETF (Acc), with a tremendous size of 31,772 mil. Euros, the rationale behind it being the outstanding performance of the S&P 500 in the last 12 years, and the statistics behind it telling us that since 1926, the S&P 500 brought investors an annualized return of 9.8%.
But things have not always been this great for the USA. For example, from the 1960s to the 1990s the US stock market brought the same return as other ex-US stock markets. Moreover, even if it now has the biggest proportion of total world stock market capitalization at 56.4%, things were very different in 1990, when Japan made up nearly 45% of the world stock market while the US made up 29%. We all know what happened to investors who bet in 1990 on the Japanese stock market for being the most robust at the time.
Vanguard has a lovely section of Investing Research at . This paper, “Global equity investing: The benefits of diversification and sizing your allocation”, was a really nice read on the topic.
In my country there’s a saying: “You never know where the rabbit might pop up from” (China? India? A European resurgence? Who knows...). That means: even if the US now has a very diversified and dynamic economy, even if half of the S&P 500 companies’ revenue comes from outside the US, and even though the correlation of stock market downturns has increased in recent decades, that still does not change the fact that such an investor is 100% overexposed to the USA, the US tax system, USD currency fluctuations, and US companies only, while ignoring (and missing the gains of) colossal companies such as Alibaba, Tencent, Nestle, Taiwan Semiconductor, Roche, Samsung, Novartis, Toyota…
I’m not all “doom and gloom” on the US economy for the next 40 years (this being the period of a buy-and-hold strategy for retiring with dignity with the help of the stock market), but why take the risk? This is why an All-World index fund weighted by market capitalization (where the USA is still represented at 56.4%) might well be the very best choice for most retail investors. This strategy reduces volatility, reduces the overexposure to the US economy and currency, and is the pinnacle of being diversified (the only free lunch in investing).
Over the last 120 years, global equities have provided an annualized real (i.e., after inflation) return of 5.2%, versus 2.0% for bonds and 0.8% for bills. The mean inflation considered in this analysis is 2.8% (yes, including the Weimar inflation), so the total nominal return of world stocks is about 8% annually. This includes the Russian stock market going to zero in 1917 (thanks, Lenin) and the Chinese one going to zero in 1949 (thanks, Mao). Source:
I might be wrong. The USA might still be the world's capitalist powerhouse that will continue to bring almost 10% annualized return. But I am more comfortable going with an All-World fund that might bring 7-8%, but won’t be a wild ride solely on the US.
Of course, you can still create a portfolio that blends US and world stocks, adjusting the exposure to US stocks to any percentage between 56.4% and 100%. For example, Jack Bogle said in a 2017 interview that he wouldn’t allocate more than 20% of his portfolio to ex-US stocks. I made an Excel sheet that calculates just that: your preferred proportion of US exposure with a blend of VWCE and SXR8 (both trading on XETRA), with an embedded Yahoo Finance API. I’ll post it here. The only variables you need to change are the actual proportion of US stocks by market cap (green cell, source included), your preferred proportion (yellow cells), and your portfolio value (blue cell). Below that there is an “actual US exposure” figure based on the units you hold of both SXR8 and VWCE.
Link here:
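For anyone who prefers a script to a spreadsheet, the blend arithmetic behind that Excel can be sketched in a few lines of Python. This is not the actual sheet, just an illustration: the 56.4% US share and the SXR8/VWCE names come from the text above, everything else (function name, portfolio value) is made up.

```python
# Hypothetical sketch of the blend calculation. Assumption: SXR8 is 100% US,
# while VWCE holds the whole world, of which us_share is US stocks.
def sxr8_fraction(target_us, us_share=0.564):
    """Fraction of the portfolio to hold in SXR8 so that total US
    exposure equals target_us; the remainder goes into VWCE."""
    if not us_share <= target_us <= 1.0:
        raise ValueError("target must lie between the world's US share and 100%")
    # Solve x*1.0 + (1 - x)*us_share = target_us for x
    return (target_us - us_share) / (1.0 - us_share)

portfolio_eur = 10_000  # illustrative portfolio value
x = sxr8_fraction(0.80)  # 80% total US exposure -> about 54% of the money in SXR8
print(f"SXR8: {x * portfolio_eur:,.0f} EUR, VWCE: {(1 - x) * portfolio_eur:,.0f} EUR")
```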
As a side-note, the allocation in bonds depends on each and every investor, depending on how strong your stomach and how risk-averse you are. I might transfer my positions from stock ETFs to the iShares Core Global Aggregate Bond UCITS ETF EUR Hedged (Acc) (ISIN: IE00BDBRDM35) as I approach retirement, but that is a topic of the distant future.
Some may point out that replicating an All-World portfolio could also be done with iShares Core MSCI World UCITS ETF USD (Acc) (ISIN: IE00B4L5Y983) and iShares Core MSCI Emerging Markets IMI UCITS ETF (Acc) (ISIN: IE00BKM4GZ66) in an 88%-12% proportion, with a lower average TER (0.20% / 0.18% vs. VWCE’s 0.22%). The only problem is that you need to rebalance accordingly as emerging markets gain (or lose) their share of global market capitalization. And, honestly, a difference of 3-4 euros on each 1000 euros in TER is just noise compared to choosing a fund that rebalances automatically.
In summary, I believe that Vanguard FTSE All-World UCITS ETF (USD) Accumulating will be a very successful ETF in the future and might well be the only ETF you need for riding the All-World stock market until retirement. For example, I am now investing with the help of the Excel above so that I maintain, for now, an 80% US allocation, while for my girlfriend I’ve helped set up a buy-and-hold strategy for VWCE only.
Tell me what you think about it. :)
submitted by DragonSlayer314159 to eupersonalfinance [link] [comments]

2020.08.13 02:20 CoacHdi Are TSLA options mispriced or am I dumb?

Unless I've done something wrong (totally possible), it seems like I could make a really wild trade with TSLA options. The trade involves a debit call spread combined with shorting the stock. The position this creates is kind of (but definitely not) risk-free and is similar to borrowing money at a negative interest rate. I would love to hear your thoughts on this!

So as a newer options trader I've been lamenting the fact that when you buy options you normally get hurt over time due to premium decay. So I decided to see if I could find a long position in an options trade that would instead net me premium.

Right now TSLA is trading at $1562 and I was kind of interested in buying the $1500 TSLA call, but it had a huge premium. The option is priced at roughly $300 right now, of which $62 is intrinsic value and $238 is premium. So to offset the premium I investigated creating a debit spread by selling the $1625 Dec 18th call, which is priced at $251 (all premium).

Up front this would have cost me $49/share which seems reasonable. But I got really curious looking at the premium portions of the calls.
$238 - $251 = -$13
This means my position would be +$62 intrinsic combined with -$13 premium paid. Digging further into this: if TSLA tanked to $1500 or below I would lose $49, but if it went up to $1625 or above I would gain $76.
Put another way: if the stock falls $62 I lose $49; if the stock rises $63 I gain $76. No matter what price the stock ended up at, I always got the change in the stock price (only between $1500-$1625) plus (roughly) $13.
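The payoff arithmetic above is easy to sanity-check in code. A quick Python sketch of the spread at expiry, using the same numbers as above (this ignores commissions, early assignment, and exiting before expiry):

```python
# Expiry payoff per share of the debit call spread: long the $1500 call
# ($300 paid), short the $1625 call ($251 received), net debit $49.
def spread_payoff(S, long_k=1500, short_k=1625, net_debit=49):
    return max(S - long_k, 0) - max(S - short_k, 0) - net_debit

print(spread_payoff(1400))  # below both strikes: lose the $49 debit
print(spread_payoff(1625))  # at or above the short strike: max gain of $76
print(spread_payoff(1562))  # at the current price: the ~$13 "premium edge"
```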

Being the greedy capitalist bastard that I am... I thought, well, why can't I just make the $13 without the risk of the stock; at this point, not even WallStreetBets cares about TSLA anyway. So I devised an evil plan: what if, at the same time that I enter the debit call spread, I also short TSLA? That way I always make my $13 (or part of it) as long as I exit the position before the TSLA price goes above $1625 or below $1500. Oh, and it comes with an added benefit: the short sale proceeds can be used to buy the position in the first place, and the extra can go in something REAL exciting like treasuries.

Now I bet I know what you're thinking: a $62 move in TSLA stock is like a rounding error these days, and I would probably only earn 1 cent of premium before I would have to sell the position due to reaching the top or bottom end of the spread. Well... you're right, but look at the other Dec 18th calls (note that both strikes are a roughly symmetrical distance from the spot price: +490 and -490)
1070 TSLA Dec 18 call costs $557 ($490 intrinsic, $67 premium)
2050 TSLA Dec 18 call costs $142 (all premium)
$67 - $142 = -$75
Shorting 1 share provides me with a credit of $1562
Taking the debit spread (on one share) costs $415
I invest the remaining $1147 at some tiny interest rate (like Robinhood's 0.3% APY)
Slowly earn my ~$75 net premium
Wait & Profit??
Exit whole position if it ever gets uncomfortably close to either side of the spread (especially the top end)
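Putting the steps above together, a rough Python sketch (ignoring interest on the short proceeds, borrow fees, margin requirements, and early assignment) shows why the combined P&L is flat between the strikes and why the top end is the dangerous side:

```python
# Combined expiry P&L per share: debit spread (long 1070 call at $557,
# short 2050 call at $142, net debit $415) plus a short share sold at $1562.
def total_pnl(S):
    spread = max(S - 1070, 0) - max(S - 2050, 0) - (557 - 142)
    short = 1562 - S  # P&L of the short stock leg
    return spread + short

# Between the strikes the S terms cancel, so P&L is flat (~the $75 net
# premium plus a couple of dollars of rounding in the quoted prices):
print(total_pnl(1200), total_pnl(1900))  # both 77
print(total_pnl(1000))  # below the lower strike the short keeps gaining: 147
print(total_pnl(2300))  # above the upper strike losses grow without bound: -173
```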

Am I missing something? Is this why there are so many short sellers in TSLA stock? Is it free money? Maybe this is because stuff is mispriced due to the market not being open?

By my calculations, by using a debit call spread on TSLA and shorting the stock at the same time (with stop losses near the strikes of the spread), someone could borrow money and get paid for it

Also note: I am currently long TSLA stock
EDIT: Thanks for all the comments; the verdict is that I'm dumb: it's definitely more risky than I originally thought!
submitted by CoacHdi to wallstreetbets [link] [comments]

2020.08.12 13:49 surmiran Introducing - KARL: Born Ready

It's been over a year since Deep Rock Galactic provided its employees with Karl's Advanced Remote Loadout, or KARL for short.
Since there's been an influx of new employees recently, management has decided to spend a big sack of your hard earned credits and create "KARL: Born Ready".
With all the features you know and love from the original KARL, but with exciting new functionalities including:
- Saving loadouts to your profile
- Browsing through loadouts other employees created
- Saluting to show respect to your favorite loadouts
- Coming in the near future: Advanced statistics and calculations for your loadouts and guns. Grenades, perks, pickaxes. Public API with weapon, mod, and overclock data. And many more.

KARL: Born Ready is available on
Employee feedback is appreciated and can be submitted through Discord:

Rock and Stone!

*KARL: Born Ready is a collaboration between the creators of the drg-builds, DPS Calculator, and KARL projects, as well as other dedicated members of the community, who found each other through Karl’s grace on reddit. This is a fan project and not related to Ghost Ship Games or Coffee Stain Publishing. The content and assets used are theirs but should fall under fair use.
submitted by surmiran to DeepRockGalactic [link] [comments]

2020.08.07 22:04 bigdata_biggersquats Strength of Schedule / Luck in Fantasy :: Python, PHP, APIs

Strength of Schedule / Luck in Fantasy :: Python, PHP, APIs
I love the idea of this subreddit: sharing our approaches and helping each other learn more about programming along the way. One of my goals for 2020 was to learn web development, so I figured: why not start with a fantasy football project?
I already know python but wanted to learn web development. I decided to convert my old python code to PHP and integrate into a website. Any ESPN fantasy player can use it to connect to their league and find out who had the most lucky/unlucky wins in 2018 or 2019.

I am a data scientist with a foundation in Python and R. A few years ago I leveraged mkreiser's awesome Python library to connect to ESPN's fantasy API and build a few custom analyses for my leagues. One of the favorites among my friends is the "Luck Analysis", which calculates the degree to which each team got lucky/unlucky wins over the course of the season (basically the same thing as a strength-of-schedule calculator). However, it did sort of backfire on me: when I was 9-0 at one point last year, my own analysis told everyone in my league that 4 of those wins were complete luck.
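For the curious, the core of such a luck metric can be sketched with the common "all-play" approach: a team's expected wins are, for each week, the fraction of the other teams its score would have beaten. This is not the author's actual code, and the teams, scores, and win totals below are invented:

```python
# Invented example data: weekly scores and actual head-to-head wins.
scores = {
    "Team A": [120, 95, 130],
    "Team B": [100, 110, 90],
    "Team C": [80, 105, 125],
}
actual_wins = {"Team A": 2, "Team B": 2, "Team C": 0}

def expected_wins(team):
    """All-play expected wins: each week, the share of opponents beaten."""
    total = 0.0
    for week, score in enumerate(scores[team]):
        others = [s[week] for t, s in scores.items() if t != team]
        total += sum(score > o for o in others) / len(others)
    return total

for team in scores:
    luck = actual_wins[team] - expected_wins(team)
    print(f"{team}: expected {expected_wins(team):.1f} wins, luck {luck:+.1f}")
```

A positive "luck" value means the schedule handed the team more wins than its scores deserved.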
Learning Web Development :: Weaning off Python
I decided to learn web development by building a site that lets any ESPN fantasy player connect to their league's data, and the Luck Analysis would be generated for them (without them having to do any programming).
First Try: I attempted to use PythonAnywhere to create a Flask app. The app had a form on the front end where a user could enter his/her league details, and I would return the data. Cool! But when I started looking at integrating this into a WordPress site, I pretty much hit a wall. I'm still not sure if this is possible, so let me know if you know anything about this.
Second Try: Python shortcodes for WordPress. I learned that most plugins for WordPress are programmed in PHP. The idea of learning a new language wasn't too daunting for me since I know quite a few already. But from what I was seeing on Google and Stack Overflow, PHP is a different beast. Therefore I tried to stick to my python guns (pythons?) as much as possible. I found out that it is possible to call Python routines (a .py file) from a PHP script. Something like this:

I got this working, but I learned that there were a ton of limitations. For one, passing back and forth between PHP and Python is very inefficient in terms of response time. I was also limited in the way data is passed back from the Python call: basically I could only pass strings back and forth rather than any sort of dataframes. If anyone knows whether it is possible to pass richer data back and forth, let me know!
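One common workaround for the strings-only limitation is to serialize on the Python side and decode on the PHP side: the Python script prints JSON to stdout, and PHP captures that string (e.g. with shell_exec) and runs json_decode on it. A hypothetical sketch of the Python half; the file name, function, and fields are all invented for illustration:

```python
# luck.py - hypothetical Python half of a PHP <-> Python bridge.
# PHP can only capture this script's stdout as a string, so any structured
# result is serialized as JSON before printing.
import json
import sys

def luck_analysis(league_id):
    # Placeholder for a real ESPN lookup; returns dummy data.
    return {"league": league_id, "luckiest": "Team A", "lucky_wins": 4}

if __name__ == "__main__":
    league_id = sys.argv[1] if len(sys.argv) > 1 else "demo"
    print(json.dumps(luck_analysis(league_id)))
```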
Third Try: Full-fledged PHP! After exhausting my Python options I figured it was best to step out of my comfort zone and really learn PHP. I learned exclusively through reading blogs and Stack Overflow as I went. I found an amazing library, Guzzle, for making API requests without using a cURL routine. Once I was able to connect to ESPN's API for my own league, I started exploring how I could let other users access their leagues. I built a process that lets users enter their ESPN login info into a form. With that, I can get the two necessary cookies from the user's profile to make the API call and read the data from their league (this works for both public and private ESPN leagues). From there I can run the Luck Analysis for any ESPN fantasy user!
For the visualization I used D3JS which is by far my favorite visualization framework. It's built on a combination of JS, HTML, and CSS and makes it easy to build responsive visualizations.
Next Steps
I want to build on this framework to add more analyses. Feel free to check out the site (no ads and everything is free) and even run the analysis for your league if you play on ESPN! Let me know what you think!
Write-up explaining Luck Analysis:
Luck from my league last year
submitted by bigdata_biggersquats to fantasyfootballcoding [link] [comments]

2020.08.04 23:03 Aztechnology What information is technically required for a simulator. Is it possible to write an API layer for something like a Mevo in TGC?

More pertinent to my question: there are a lot of cheaper launch monitors like the Mevo.
The Mevo in particular calculates:
-Carry distance
-Club head speed
-Ball speed
-Spin rate
-Launch angle
-Smash factor
-Apex height
-Flight time
If something like the OptiShot or R-Motion is able to drive simulator software like TGC or E6, then shouldn't the Mevo or some of these others be just as capable, if not more so? The data is clearly being translated via an API to an app. Long shot, I know, but has anyone here tinkered around in TGC or with various mid-range launch monitors to see if we have access to those parameters/data? If it's possible, I'd love to look into writing the software (I'm a software developer, so I have at least some background) to translate from something like the Mevo to simulator software, as the other devices are significantly more expensive and hard to get hold of. Even if I wanted, say, a SkyTrak or Mevo+, they are back-ordered for months.
I might be completely off base in understanding how the game gets its data to calculate a shot. But I have to imagine it more or less passes a package of variables to the game to use to simulate a shot?
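To make the "package of variables" idea concrete, here is pure speculation in Python about what such a payload might look like, using the parameters the Mevo reports as fields. Every name here is an assumption; neither TGC nor E6 publishes a schema like this:

```python
# Speculative shot payload a launch-monitor translation layer might pass
# to simulator software. All field names are invented for illustration.
from dataclasses import dataclass, asdict

@dataclass
class ShotData:
    carry_distance_yd: float
    club_head_speed_mph: float
    ball_speed_mph: float
    spin_rate_rpm: float
    launch_angle_deg: float
    smash_factor: float   # ball speed / club head speed
    apex_height_ft: float
    flight_time_s: float

shot = ShotData(245.0, 98.0, 145.0, 2600.0, 12.5, 145.0 / 98.0, 88.0, 6.1)
print(asdict(shot))  # the dict an API layer could hand to the simulator
```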
P.S. I know there's a golf simulator forum. I've signed up for it, but I cannot make threads until moderation reviews my account, and it's been a few days with no action, so I'm trying my luck here first.
submitted by Aztechnology to golf [link] [comments]

2020.08.04 14:18 poodle_noodle42 Lenovo Thinkpad T14s Linux Review

TLDR: Everything works, from controlling the brightness of the display, to the fingerprint sensor, from suspend to RAM to HDMI and DP Output.

About this review:

This review is targeted at people who want to use their Thinkpad with Linux only. It doesn’t contain any comparison between Windows and Linux. Under every headline you will find a short TLDR section which summarizes the content of the section. Feel free to skip sections you are not interested in and to ask questions. Please be kind about spelling and grammar mistakes since English is not my native language. If you find something wrong, feel free to correct me.


The model I tested is from the Lenovo Campus Program and has a Ryzen 7 4750U PRO processor, 32GB of dual-channel RAM, the 400-nit Full-HD low-power display, and a one-terabyte NVMe SSD.
I did all testing with Manjaro Cinnamon and Kernel 5.8rc5, which is the latest kernel available in the Manjaro repository.

The Outside:

TLDR: It’s a Thinkpad.
What shall I say? It is a Thinkpad. It feels very solid with its magnesium body. The keyboard is one of the best you can get in this notebook class. Personally, I think the touchpad is big enough, and I really love having dedicated mouse buttons at the top. At 1.2 kg it's pretty light, and at 1.6 cm it's thin but not too thin. Overall a very solid outside experience.

The Display:

TLDR: Display is great. Everything works out of the box.
I got the 400-nit low-power Full-HD display and it's great. It is bright enough to work outside. The colors are very balanced and natural, probably even a bit better than on my LG IPS monitor. Personally, I believe that Full-HD is enough for this display size. On the software side, the default scaling settings work quite well for this size and resolution. Controlling the brightness using the Fn keys also works out of the box.

CPU Performance and Cooling System:

TLDR: Performance is amazing, cooling system is good.
I used the command lscpu | grep MHz together with the watch command to monitor the CPU frequency, as well as lm-sensors for temperature and fan speed. For comparison, I have a Ryzen 5 2600 in my desktop PC. In short tests like the ones included in hardinfo, this laptop crushes my desktop CPU by boosting up to 4186MHz, resulting in at least one-third better results. During these short tests the fan is completely off, 0 RPM. Indeed, when doing office work, programming, or web browsing, this laptop is cooled completely passively. In fact, one can have up to 20 seconds of all-core load before the fan starts spinning. To test the cooling system I compiled the complete Qt5 framework from source for Windows with MXE, over an hour of all-core load. During such a period of load the fan spins at around 3700RPM, cooling the CPU to about 75-80°C. The processor itself stays at around 2400MHz, making it a bit slower than my desktop CPU. During such a period the whole casing gets so warm on the bottom that I would not recommend having it on your lap. In this mode the fan is clearly audible but not annoying. When the CPU gets a pause for 5 seconds, it can then boost up to 2700MHz again for half a minute. Idling, however, everything stays very cool, with the CPU at 40-50°C clocking at around 1200-1400MHz.

Gaming Performance:

TLDR: It is a business notebook but you can still play games.
Do not expect much; it is still a business laptop with an iGPU. But still, I installed Steam and the native port of The Witcher 2, and what shall I say: it runs at 720p with high settings and AA off at 40-50 FPS in Arena Mode. It looks pretty impressive. But when you buy such a laptop and install Linux on it, gaming is probably not your main desire.

Battery Life:

TLDR: Great at low loads; could be (and probably will be in the future) better when streaming video.
I installed TLP and TLPUI and tweaked the settings a bit, but just slightly. I wrote a little golang script to monitor the average power consumption; you will find it at the end of the post. Setting charging limits already works. Here are the results of my script. For calculating the battery life I assumed the laptop has a 57Wh battery, as stated on the Lenovo site.
-2h Office (LibreOffice Writer, Firefox, Thunderbird, 1/3 screen brightness) 5.3W → 11.4h
-2h Video streaming (Chromium, Youtube 720p, lowest screen brightness) 9.5W → 6.3h
-2h Office (LibreOffice Writer, Firefox, Thunderbird, full screen brightness) 6.4W → 9.5h
The video runtime is a bit disappointing, even though I installed VA-API and VDPAU drivers, enabled video acceleration in Chromium, and used h264ify so that video acceleration should definitely work.
In my real-life experience I could confirm these measurements and calculations; battery life is really good.
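As a cross-check, the runtime estimates above boil down to hours = capacity in Wh divided by average draw in W. Curiously, plugging in the stated 57 Wh gives slightly shorter times than those listed, which look as if they were computed with a figure closer to 60 Wh:

```python
# Runtime estimate: hours = battery capacity (Wh) / average power draw (W).
capacity_wh = 57.0  # capacity as stated on the Lenovo site
for scenario, watts in [("office, 1/3 brightness", 5.3),
                        ("video streaming", 9.5),
                        ("office, full brightness", 6.4)]:
    print(f"{scenario}: {capacity_wh / watts:.1f} h")
```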

Other stuff:

Suspend to RAM:

Just works. In 6 hours the battery lost 8%.

SED – Self Encrypting Disk:

Works with the built-in SSD. See here how to set it up so that it works with suspend to RAM.

Fingerprint Sensor:

Working. Use the instructions for the previous model:
The firmware to install is the same. You can get it with the Gnome-Firmware utility or directly with fwupd.

Video Output:

HDMI and DP-Alt worked out of the box.

WIFI and Bluetooth:

5 Ghz WIFI and Bluetooth Audio are working great. I could stream games from my desktop using moonlight with no latency.


I have the RJ45 adapter for the LAN port on the side. What should I say? It is working. It is LAN.


A very solid and robust business laptop with brutal CPU performance, good battery life, and superb Linux compatibility.

Go Program:

    package main

    import (
        "fmt"
        "os/exec"
        "strconv"
        "strings"
        "time"
    )

    func main() {
        var sum float64 = 0
        for i := 1; i > 0; i++ {
            cmdPowerConsumption := exec.Command("awk", "{print $1*10^-6}", "/sys/class/power_supply/BAT0/power_now")
            powerConsumtionBytes, err := cmdPowerConsumption.Output()
            if err == nil {
                s := string(powerConsumtionBytes)
                s = strings.TrimSuffix(s, "\n")
                powerConsumtionFloat, _ := strconv.ParseFloat(s, 64)
                sum += powerConsumtionFloat
                fmt.Printf("%f\r", sum/float64(i))
                time.Sleep(1 * time.Second)
            } else {
                fmt.Println(err)
            }
        }
    }
submitted by poodle_noodle42 to thinkpad [link] [comments]
