Search - "processor"
-
Boss: I need to demo our product but it looks smaller on my laptop.
Me: That is because you have a 1920x1080 monitor and your laptop is 1280x800
Boss: Is that something you can fix?
Me: No, you will need a new laptop, but the company has a sales laptop with that resolution.
Boss: No, just get the company credit card and buy me one today!
*Boss's son hears*
Boss's Son: Here, take the sales laptop
Boss: Will that be quick enough?
Boss's Son: It has an 8-core i7 processor, 16GB of RAM and a dedicated GPU
Boss: *looks at me confused*
Me: You're demoing a web browser, that will be more than OK. But we're using Chrome, so 16GB of RAM will be pushing it.
*me and the boss's son laugh*
Boss: Can we upgrade it?
-
Whenever I come across some acronyms...
CD-ROM: Consumer Device, Rendered Obsolete in Months
PCMCIA: People Can’t Memorize Computer Industry Acronyms
ISDN: It Still Does Nothing
SCSI: System Can’t See It
MIPS: Meaningless Indication of Processor Speed
DOS: Defunct Operating System
WINDOWS: Will Install Needless Data On Whole System
OS/2: Obsolete Soon, Too
PnP: Plug and Pray
APPLE: Arrogance Produces Profit-Losing Entity
IBM: I Blame Microsoft
MICROSOFT: Most Intelligent Customers Realize Our Software Only Fools Teenagers
COBOL: Completely Obsolete Business Oriented Language
LISP: Lots of Insipid and Stupid Parentheses
MACINTOSH: Most Applications Crash; If Not, The Operating System Hangs
-
We got a server with two Intel Xeon E5520 processors, and each processor has 8 cores at 2.27GHz.
Also the server has 36GB of internal memory.
What do we do with it? We play Solitaire 😎
-
Friend: "Why did you buy a Macbook Pro? Look at the specs, the RAM, the storage, the processor.. heck, ain't it overpriced? I wouldn't if I were you"
Me: "No, I didn't buy it. My company gave it to me when I joined them."
Friend: "Oh.. okay... hey, is there any job opening in your company?"13 -
Got my new workstation.
Isn't it a beauty?
Rocking a Pentium II 366 MHz processor.
6 GB HDD.
64 MB SDRAM.
1 minute of battery life.
Resolution up to SXGA (1280x1024)
Removable CD-ROM drive.
1 USB port (we like to use dongles, right?)
Also it has state of the art security:
- No webcam
- No Mic
- Removable WiFi
- I forgot the password
And best of all:
It has a nipple to play with!!
-
I want to stop charging my e-scooter at around 85% because this will increase the battery life. To avoid always having to pull the plug at the right level, I made a stop circuit that goes between charging brick and e-scooter.
There's no processor involved, just a CMOS 555 used as an inverting Schmitt trigger which controls a power MOSFET. Also two status LEDs and a start switch. The potentiometer adjusts the cut-off level. Worked on first try, with only manual voltage and tolerance calculations beforehand!
-
When I was in 7th grade, my neighbor (a DoD programmer) challenged me to write a sorting algorithm for a hypothetical super limited environment (he said a satellite). It didn’t have any built-in sorting methods, had very limited memory, slow processor, etc. so I needed to be clever about it.
It took me a few nights before I found a solution he liked. The method I came up with counted the number of occurrences of each number in the array and put them in the appropriate spots in a new array. This way it only required O(2n) running time and 2n memory.
I just learned today that this is called the “counting sort” 😄
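For the curious, a minimal sketch of that counting sort (my own Python, not the original solution; it assumes non-negative integers with a known maximum):

```python
def counting_sort(values, max_value):
    # Count how many times each number occurs.
    counts = [0] * (max_value + 1)
    for v in values:
        counts[v] += 1

    # Write each number back out in order, as many times as it occurred.
    result = []
    for number, count in enumerate(counts):
        result.extend([number] * count)
    return result

print(counting_sort([3, 1, 4, 1, 5, 9, 2, 6], max_value=9))
# [1, 1, 2, 3, 4, 5, 6, 9]
```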
I'm proud of little 11-year-old me.
-
"Arch Linux is actually not that difficult".
I ssh'ed into my home server yesterday.
I was greeted by a message from an ext3 disk about needing fsck. Fine, "I haven't been in here for a while, might as well do some maintenance". fsck /dev/sda6, let's go!
This nicely "repaired" the sshd service (i.e. cleared the sectors), I cursed at myself for pressing enter at "repair (y)" right before the connection broke.
So I connected a display and keyboard... ok so let's just pacman -Sy sshd or whatever. We can do this! Just check the wiki, shouldn't be that hard!
Wait... pacman has not run since 2010? WAIT, ITS ACTUAL UPTIME IS 9 YEARS??? I guess we know why I'm a DB admin and not devops...
Hmm all the mirrors give timeouts? Oh. The i686 processor architecture isn't even supported anymore...?
4 hours, 11 glasses of cognac, 73 Arch32 wiki/forum pages, 2 attempts at compiling glibc, and 4 kernel panics later: "I think I'll buy a new server."
-
THIS is why unit testing is important; I often see newbs scoff at the idea of debugging or testing:
My high school CS project: I made a 2D game in C++. A generic top-down tank game. It being my FIRST project, and knowing nothing about debugging or testing, I just straight up kept at it for 3 months. Used everything C++ and OOP had to offer, thinking "It works now, sure it will work later."
Fast forward to evaluation day: I had over 5k lines of code here, and not a day of testing; ALL the bugs thought to themselves, "YOU KNOW WHAT, LET'S GUT THIS KID."
Now I did see some minor infractions several times but nothing too serious to make me refactor my code. But here goes
I started my game on a different system, with a low-end processor about 1/4 the power of mine (fair assumption). The game crashed on the loading screen. Okay, let's do that again. It finally starts, and tanks are going off screen, dead tanks are not being de-spawned, and it ended up crashing the game again. Wow, okay, again! The background image didn't load, I can only see a black background. Again! It crashed when I used a special ability. This went on for some time and I gave up.
The prof saw the pain, he'd probably seen dis shit a million times, saw all the hard work, and I got a good grade anyway. But god, that was embarrassing; the entire class saw it and I cringe at the thought of it.
I never looked at testing the same way again.
-
Finally, I've upgraded my processor. From single core 2.2 to ... wait for it ... dual core 2.7!!!!!!!!!!! Hooray
-
Larry Tesler, a computer scientist who created the terms "cut," "copy," and "paste," has passed away at the age of 74 (17 Feb 2020).
In 1973, Tesler took a job at the Xerox Palo Alto Research Center (PARC) where he worked until 1980. Xerox PARC is famously known for developing the mouse-driven graphical user interface and during his time at the lab Tesler worked with Tim Mott to create a word processor called Gypsy that is best known for coining the terms "cut," "copy," and "paste".
In addition to "cut," "copy," and "paste" terminologies, Tesler was also an advocate for an approach to UI design known as modeless computing. It ensures that user actions remain consistent throughout an operating system's various functions and apps. When they've opened a word processor, for instance, users now just automatically assume that hitting any of the alphanumeric keys on their keyboard will result in that character showing up on-screen at the cursor's insertion point. But there was a time when word processors could be switched between multiple modes where typing on the keyboard would either add characters to a document or alternately allow functional commands to be entered.10 -
About two years ago I got roped into something when someone was requesting an $8000 laptop to run a "program" that they wrote in Excel to pull data from our mainframe.
In reality they are using our normal application that interacts with the mainframe and screen scraping it to populate several Excel spreadsheets.
So this guy kept saying that he needed the expensive laptop because he needed the extra RAM and processing power for his application. At the time we only supported 32 bit Windows 7 so even though I told him ten times that the OS wouldn't recognize more than 3.5 GB of RAM he kept saying that increasing the RAM would fix his problem. I also explained that even if we installed the 64 bit OS we didn't have approval for the 64 bit applications.
So we looked at the code and we found that rather than reusing the same workbook he was opening a new instance of a workbook during each iteration of his loop and then not closing or disposing of them. So he was running out of memory due to never disposing of anything.
Even better than all of that, he wanted a faster processor to speed up the processing, but he had about 5 seconds of thread sleeps in each loop so that the place he was screen scraping from would have time to load. So it wouldn't matter how fast the processor was; in the end there were sleeps and waits in there, hard coded to slow down the app. And the guy didn't understand that a faster processor wouldn't have made a difference.
The worst thing is a "dev" that thinks they know what they are doing but they don't have a clue.
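The two fixes boil down to something like this - a generic Python sketch of the pattern, not his actual VBA; `open_workbook`, `scrape` and `screen_ready` are hypothetical stand-ins for whatever the real code did:

```python
import time

# Anti-pattern from the loop above: a new workbook object per iteration,
# never closed, plus a fixed 5-second sleep that dominates the runtime.
#
#     for row in rows:
#         wb = open_workbook(path)   # leaks one handle per iteration
#         time.sleep(5)              # CPU speed is irrelevant here
#         scrape(wb, row)

def process(rows, path, open_workbook, scrape, screen_ready, timeout=5.0):
    wb = open_workbook(path)          # one handle, reused for every row
    try:
        for row in rows:
            deadline = time.monotonic() + timeout
            while not screen_ready(row):       # poll instead of a blind sleep
                if time.monotonic() > deadline:
                    raise TimeoutError(f"screen never loaded for {row!r}")
                time.sleep(0.1)
            scrape(wb, row)
    finally:
        wb.close()                    # dispose of the one object exactly once
```
-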
"Schrödingers CPU": When your processor randomly and for no apparent reason spins up to 100% and brings the whole system to a halt. Bringing up task manager takes forever,
and when it finally appears so you can see which process is the culprit, it quickly drops back to normal again, leaving you with no clue what happened.
-
College be like
"Today you are going to build an ARM processor, questions ?"
"Yeah, how do we do ?"
"It's not my business" -
Anyone else heard of this cool little device? (GPD Win, an obscure Chinese computer that is like a Nintendo DS but has an x64 processor and runs full Windows 10)
-
Here are the reasons why I don't like IPv6.
Now I'll be honest, I hate IPv6 with all my heart. So I'm not supporting it until inevitably it becomes the de facto standard of the internet. In home networks on the other hand.. huehue...
The main reason why I hate it is because it looks in every way overengineered. Or rather, poorly engineered. IPv4 has 32 bits worth, which translates to about 4 billion addresses. IPv6 on the other hand has 128 bits worth of addresses.. which translates to.. some obscenely huge number that I don't even want to start translating.
That's the problem. It's too big. Anyone who's worked on the internet for any amount of time knows that the internet on this planet will likely not exceed an amount of machines equal to about 1 or 2 extra bits (8.5B and 17.1B respectively). Now of course 33 or 34 bits in total is unwieldy, it doesn't go well with electronics. From 32 you essentially have to go up to 64 straight away. That's why 64-bit processors are.. well, 64 bits. The memory grew larger than the 4GB that a 32-bit processor could support, so that's what happened.
The internet could've grown that way too. Heck it probably could've become 64 bits in total of which 34 are assigned to the internet and the remaining bits are for whatever purposes large IP consumers would like to use the remainder for.
Whoever designed IPv6 however.. nope! Let's give everyone a /64 range, and give them quite literally an IP pool far, FAR larger than the entire current internet. What's the fucking point!?
The IPv6 standard is far larger than it should've been. It should've been 64 bits instead of 128, and it should've been separated differently. What were they thinking? A bazillion colonized planets' internetworks that would join the main internet as well? Yeah that's clearly something that the internet will develop into. The internet which is effectively just a big network that everyone leases and controls a little bit of. Just like a home network but scaled up. Imagine or even just look at the engineering challenges that interplanetary communications present. That is not going to be feasible for connecting multiple planets' internets. You can engineer however you want but you can't engineer around the hard limit of light speed. Besides, are our satellites internet-connected? Well yes but try using one. And those whizz only a couple of km above sea level. The latency involved makes it barely usable. Imagine communicating to the ISS, the moon or Mars. That is not going to happen at an internet scale. Not even close. And those are only the closest celestial objects out there.
So why was IPv6 engineered with hundreds of years of development and likely at least a stage 4 civilization in mind? No idea. Future-proofing or poor engineering? I honestly don't know. But as a stage 0 or maybe stage 1 person, I don't think that I or civilization for that matter is ready for a 128-bit internet. And we aren't even close to needing so many bits.
Going back to 64-bit processors and memory: we passed 32-bit address width about a decade ago. But even now, we're only at about twice that size on average. We're not even close to saturating 64-bit address width, and that will likely take at least a few hundred years as well. I'd say that's more than sufficient. The internet should've really become a 64-bit internet too.
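For scale, here's the quick arithmetic behind the bit widths being argued about (plain Python, nothing assumed beyond the numbers already mentioned):

```python
# Number of addresses for each bit width mentioned above.
for bits in (32, 33, 34, 64, 128):
    print(f"{bits:3d} bits -> {2 ** bits:,} addresses")

# A single IPv6 /64 allocation already holds 2**64 addresses, i.e. about
# 4.3 billion times the size of the entire IPv4 address space:
print(2 ** 64 // 2 ** 32)   # 4294967296
```
-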
*When your processor is left with no power to create a process to kill the process that's consuming all your processor's processing power.
-
Okay, let's write this before I go mad...
I'm one of those guys who says "use the OS which suits you the most, or the one you're most familiar with", and I've always been a Windows guy; I didn't really have any reason to use Linux, because for school stuff or programming (Java, Android and C) Windows was good enough...
BUT MOTHERF@CKERS at Microsoft, I've had enough...
First my handheld computer goes nuts, because Windows is eating 80% of the processor, and if I fix it, then some other Windows-related thing eats up that much, and you know what? I've been okay with that, because that's only a handheld computer, but boy, didn't my main computer start to do the same?!?
I cannot do anything. Basically I start something trivial up (and by trivial I mean trivial, like idk, a text editor, not even a browser, or an IDE or anything that would take a bit more RAM) and my computer can't do shit....
I'm so mad.... Currently installing elementary OS... F@ck this shit, I'm out...
(And let's not forget the hours of 'updates' which don't do shit....)
-
I have a laptop which I bought for the sole purpose of gaming, and I bought a hell of a lot of games off Steam to start with.
But the problem here is I have to run those games on Windows (NVIDIA graphics card) and I only have a primary HDD and no SSD. Even though the RAM and processor are up to speed, due to high disk I/O I am not able to get good performance out of it. To top it off, random Windows processes keep hogging the HDD in the background.
Any suggestions on what to do?
-
Been reviewing A LOT of client and supplier code lately. I just want to sit in the corner and cry.
Somewhere along the line the education system has failed a generation of software engineers.
I am an embedded C programmer, so I'm pretty low level, but I have worked up and down and across the abstractions in the industry. The high-level guys, I think, don't make these same mistakes due to the stuff they learn in CS courses regarding OOD, with reference to how to properly architect software in a modular way.
I think it may be that too often the embedded software is written by EEs and not CEs, and due to their curriculum they lack good software architecture design.
Too often I will see huge functions with large blocks of copy-pasted code where the only difference is a variable name. All stuff that can be turned into tables and iterated through so the function can be less than 20 lines long in the end, which is like a 100x improvement when the function started out as 2000 lines because they decided to hard-code everything and not let the code and processor do what they're good at.
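The difference between the two styles looks roughly like this - a Python sketch of the idea, not the reviewed C; `bus.read` and the channel names are made up for illustration:

```python
# Copy-paste style: the same block repeated, only the name/address changes.
def read_sensors_copy_paste(bus):
    temp = bus.read("temp_addr")
    print("temp:", temp)
    volt = bus.read("volt_addr")
    print("volt:", volt)
    amps = bus.read("amps_addr")
    print("amps:", amps)
    # ...dozens more near-identical blocks

# Table-driven style: the differences live in data, the logic is written once.
CHANNELS = [
    ("temp", "temp_addr"),
    ("volt", "volt_addr"),
    ("amps", "amps_addr"),
]

def read_sensors_table(bus):
    for name, addr in CHANNELS:
        print(f"{name}: {bus.read(addr)}")
```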
Arguments of performance are moot at this point, I’m well aware of constraints and this is not one of them that is affected.
The problem I have is trying to take their code in and understand what it's trying to do, and to do that you must scan up and down HUGE sections of the code, even 10k+ lines in one file, because their design was not to even use multiple files!
Does their code function? Yes.. does it work? Yes.. the problem is readability and maintainability. Completely nonexistent.
I see it so often I almost begin to second-guess myself and think.. am I the crazy one here? No. And it's not their fault, it's the education system. They weren't taught it, so they think this is just what programmers do: a hugely mundane copy-paste of words, change a little thing here and there, and done. NO. Actual software engineers architect systems and write code in a way that lets them do it in the laziest way possible. Not how these folks do it.. it's like all they know are if statements and switch statements and everything else is unneeded.. fuck structures and shit, just hard-code it all... explicitly write everything, let's not be smart about anything.
I know I've said it before, but with covid and winning so much more business due to competition going under, I never got around to doing my YouTube channel and web series on how I believe software should be taught across the board.. it's more than just syntax, it's a way of thinking.. a specific way of architecting any software, embedded or high level.
Anyway, rant off, had to get that off my chest. I literally want to sit in the corner and cry this weekend at the horrible code I'm reviewing, and it just constantly keeps happening. Over and over and over. The more people I bring on or projects I acquire, it's like fuck me, wtf is this shit!!! Take some pride in the code you write!
-
I have a WhatsApp group with my friends, none of whom are techies. A while ago one of them was looking for a phone to buy, so he started looking at models, specs and all that, but got pretty confused and asked a pretty well-informed question to the group:
"Guys, what is that quad core thing?
And what is a RAM? Is it something like the processor of the phone or what? "
OK, pretty typical stuff up until this point. The guy knows nothing about this sort of thing; I wouldn't criticize him or insult him or anything like that. No, that's not the problem. The problem is the person that responded to him. This... This melted my brain so much I will never forget:
"Don't worry about that, you only have to look at how many gigahertz does the processor run at. Don't worry about the number of cores or ram. The GHz are the result of the amount of ram and cores, so the more the gigahertz, the better the phone."
PD: "Also take a look at how many megapixels does the camera have if you want to take photos".
Some people just talk out of their ass and pretend like they're experts on any topic they've read about for 5 minutes on the Internet7 -
Sooo Windows has updated itself (in a short 6 hours) and I was wondering what was eating my pocket PC's 4 GB of RAM, so I checked, and it turns out the system has been eating 40-50% of it and its processor ever since the update... *Sigh* Awesome....
(Picture src: tumblr/just-shower-thoughts --> [in reality] r/showerthoughts)
-
I'm a freelance web developer and I normally work on small to medium-sized websites, 9 out of 10 times based on WordPress and 10 out of 10 times with a limited budget.
8 out of 10 times the site's content will be updated by someone with, at best, casual knowledge of website management.
Say what you will about WP but it's my bread and butter and it works great for just these kinds of websites; where the cost is a dealbreaker and the end product should be as user friendly as a standard word processor.
No, you probably wouldn't build a control panel for the next space shuttle or an online bank in WordPress, but I rarely need to concern myself with those kinds of projects so that really doesn't affect me.
Pretty much the same reason I have a Kia car even though I wouldn't win a Formula 1 race with it.
I for one am grateful that there's an open source tool available to my clients that more than adequately meets their needs (that's also fun to work with and build custom solutions on for me as a developer).
-
I was on vacation when my employer’s new fiscal year started. My manager let me take vacation because it’s not like anything critical was going to happen. Well, joke was on us because we didn’t foresee the stupidity of others…
I had to update a few product codes in the website’s web config and deploy those changes. I was only going to be logged in for 30 minutes to complete that.
I get messaged by one of our database admins. He was doing testing and was unable to complete a payment on the website. That was strange. There was a change pushed by our offsite dev agency, but that was all frontend changes (just updating text) and wouldn’t affect payments.
We don’t want to enlist the dev agency for debugging work, especially when it’s not likely that it’s a code issue. But I was on vacation and I couldn’t stay online past the time I had budgeted for. So my employer enlists the dev agency for help. It’s going to be costly because the agency is in Lithuania, it was past their business hours, and it was emergency support.
Dev agency looks at error logs. There are Apple Pay errors, but that doesn’t explain why non Apple Pay transactions aren’t going through. They roll back my deployment and theirs, but no change. They tell my employer to contact our payment processor.
My manager and the Product Manager contact Payroll, who is the stakeholder for our payment gateways. Payroll contacts our payment gateway and finds out a service called Decision Manager was recently configured for our account. Decision Manager was declining all payments. Payroll was not the person who had Decision Manager installed and our account using this service was news to her.
Payroll works with our payment processor to get payments working again. The damage is pretty severe. Online payments were down for at least 12 hours. Our call center had logged reports from customers the night before.
At our post mortem, we had to find out who ok’d Decision Manager without telling anyone. Luckily, it was quick work. The first stakeholder up was for the Fundraising Dept. She said it wasn’t her or anyone on her team. Our VP of Analytics broke it to her that our payment processor gave us the name of the person who ok’d Decision Manager and it was someone on the Fundraising team. Fundraising then starts backtracking and says that oh yes she knew about it but transactions were still working after the Decision Manager had been configured. WTAF.
Everyone is dumbfounded by this. How could you make a big change to our payment processor and not tell anyone? How did our payment processor allow you to make this change when you’re not the account admin (you’re just a user)?
Our company head had to give an awkward speech about communication and how it’s important. The web team can’t figure out issues if you don’t tell us what you did. The company head was pissed because it was a shitty way to start off the new fiscal year. Our bill for the dev agency must have been over $1000 for debugging work that wasn’t helpful.
Amazingly, no one was fired.
-
So I'm looking to buy a drone for my internship company to find people during floods. And damn these companies suck balls.
Closed source.
You want to use an API for onboard image processing?
Buy a €3500 drone
Add €1100 processor stuff
Add €850 camera
Ugh.
-
When the new iPhone has "The highest pixel density in an iPhone yet" and "Most ambitious chip ever in an iPhone", like they'd make a new iPhone with a slower processor and a worse display?
-
Help.
I'm a hardware guy. If I do software, it's bare-metal (almost always). I need to fully understand my build system and tweak it exactly to my needs. I'm the sorta guy that needs memory alignment and bitwise operations on a daily basis. I'm always cautious about processor cycles, memory allocation, and power consumption. I think twice if I really need to use a float there and I consider exactly what cost the abstraction layers I build come at.
I had done some web design and development, but that was back in the day when you knew all the workarounds for IE 5-7 by heart and when people were disappointed there wasn't going to be a XHTML 2.0. I didn't build anything large until recently.
Since that time, a lot has happened. Web development has evolved in a way I didn't really fancy, to say the least. Client-side rendering for everything the server could easily do? Of course. Wasting precious energy on mobile devices because it works well enough? Naturally. Solving the simplest problems with a gigantic mess of dependencies you don't even bother to inspect? Well, how else are you going to handle all your sensitive data?
I was going to compare this to the Arduino culture of using modules you don't understand in code you don't understand. But then again, you don't see consumer products or customer-specific electronics powered by an Arduino (at least not that I'm aware of).
I'm just not fit for that shooting-drills-at-walls methodology for getting holes. I'm not against neither easy nor pretty-to-look-at solutions, but it just comes across as wasteful for me nowadays.
So, after my hiatus from web development, I've now been in a sort of internet platform project for a few months. I'm now directly confronted with all that you guys love and hate, frontend frameworks and Node for the backend and whatever. I deliberately didn't voice my opinion when the stack was chosen, because I didn't want to interfere with the modern ways and instead get some experience out of it (and I am).
And now, I'm slowly starting to feel like it was OKAY to work like this.
-
First dev job: port Unix to the Transputer, a (now defunct) bizarre processor with no stack, no registers and no compiler. That was fun! And that was in 1991 😎
-
Being a user, you watch your processor handle things...! 😪
Become a superuser 🤓
Processor watches you handle everything 😎
-
Yesterday I completed a transactions module that used an external payment processor, similar to PayPal. It was hard, but after a few hours of trying out different options I finally managed to get it to work.
I decided to create a simple prototype UI without any styling just to show my progress to the manager and let him know that it's working.
His response? "yeah, that seems to work, but that UI is terrible and not appealing at all. Change that immediately and try to add more thought into your design"
I guess I won't be making prototypes any time soon.
-
The ability to complain about "Gradle is too slow on my laptop!" to my dad to get a better one.
Saying thanks to Google.
-
We had a sprint where we removed some fields from the signup page, in order not to "scare users off" with the amount of information requested. Quite a few changes in frontend and backend alike.
Only now, on the final day of the sprint (when we're supposed to deploy the changes), do we realize some of that information is actually required by the payment processor, and likely for very predictable *legal* reasons which I even questioned during planning.
-
Had an interview with a potential customer last week, and he started questioning my technical capability in the middle of the discussion on the basis that I was taking notes with pen and paper...
Yes, I can type. At 90+ WPM, I can darn near produce a transcript of everything we say. But I won’t remember any of it afterward, because it passes straight from the ears to the hands without any processing.
“You see, that’s what we have something called ’search’ for...”
...Yeah. Except that doesn’t help with picking out the most important points from a wall of text, organizing it in a way that allows visualizing relationships between concepts, and other non-linear things that are hard to do on the fly in a word processor.
“Well, how about we get you a tablet with a pen and you can just write on that, then?”
How about no.
Ended up turning him down because of other concerns that were raised that were, suffice to say, about as onerous as you might expect from that exchange.
-
One thing I learned over the years is that even when you think you can't do something or don't have the strength to do it, you actually can.
People do nothing better than to make excuses for themselves or blame others for the things they did without even considering that they could have done something about it.
The brain is a powerful processor, to the point that when you constantly think you're sick, your body will react accordingly.
Thing is, though: if you don't take the opportunities that present themselves, or don't look for them, you'll probably get nowhere, to the point where it could lead to depression.
Sure enough, failures and mistakes happen all the time; hardly anything will go right the first time, possibly leading to becoming demotivated and sometimes even depressed.
Why? Because you forgot to think "what can I improve the next time"
A co-worker of mine keeps going back to the project he's working on because the boss has something in mind but somehow fails to communicate it to him. He never stops to think about what the desired functionality is compared to what it should do or look like (UI/UX). Eventually he snaps, blaming the boss because he had to change it a couple of times.
This has happened multiple times since I started my internship, to the point where it just starts to irritate me.
Of course it's not always your fault but there are plenty of cases where it is or where you could have prevented it.
Mistakes and failures make you stronger only if you want to learn from them.
Have a good day -
I wanted an Android phone with the latest Oreo version, at least 4GB of RAM, 32GB or more of storage, a good processor, 4G dual SIM support and a resolution of 1440 by 2560 pixels, at US$100.
Well, I got all of it for US$125.
I went to the LineageOS site, exported and filtered all supported devices by my relevant device attributes.
Got a LeEco Le Max 2 for US$125 on eBay. Installed the latest LineageOS 15.1 with Android 8.1.0 and got 25GB of free space.
Been using the phone for the last 10 days, flawlessly.
-
I enjoy watching the Microsoft events, as they always introduce something completely new that no one's really made before. Unlike certain companies *cough* Apple *cough* who just slap a better processor on their existing devices and call it "revolutionary". I like all this innovation.
-
Amdy's story.
Amdy didn't have it easy. He's just a little APU and was already outdated when he was manufactured. But it got even worse! He didn't do anything wrong, but upon assembly, they lasered a different part number on him.
He didn't think much about it, but then they denied him all the goodies his brothers got: a nice printed box, a cooler, a leaflet, and a sticker.
Amdy didn't get any of that and wasn't welcome in the boxed camp. Instead, they stuffed him into a shoddy tray cardboard box with just some ESD foam for the pins.
Amdy was disappointed. That was just not fair! He was capable like his brothers. To add insult to injury, not even the manufacturer wanted to give warranty on the poor ugly duckling. They didn't listen to his complaints and shipped him to an unknown fate.
Then our paths crossed, because Amdy was 10 EUR cheaper than the boxed ones at that point. Little Amdy breathed heavily when he finally got out of the mini box and seemed a bit disoriented. Poor little sod, what did they do to you?
Then he spotted the cooler. He had never seen anything like this before, so much better than the coolers his boxed brothers had received! And even top of the line thermal paste!
Amdy decided to be as good and fast a processor as a small Zen+ APU could possibly be. What was that software stuff? Didn't look like Windows. Ooohhh - Amdy rejoiced when he figured out that he was supposed to run Linux!
And that's how a despairing and unhappy APU finally found a life full of goodness.
-
One of the more memorable computer problems I solved was when I added some Lego blocks to fix a recurring Windows bluescreen.
A friend had a Pentium 3 (Slot 1) that kept throwing him several bluescreens per day, so I decided to help.
I opened up the computer and saw that the processor was not properly secured in its place and the plastic pieces that should have been holding it were gone, so I improvised by pressing in some Lego pieces that I found somewhere to make sure the processor didn't move if someone walked close to the computer. After that he didn't have any more bluescreens than the rest of us.
-
I just discovered that my dedicated server is a Tamagotchi lying around at somebody's home. While benchmarking the CPU I discovered that it performed worse than my 4-year-old Chromebook with a MediaTek processor.
-
Our university labs still use computers with 512MB of RAM and Celeron processors for programming and networking courses. Even worse, some of the mice/keyboards/monitors don't work and we occasionally have to do exams on those machines...
-
Chrome is getting its ass whooped and then crying like a whiny bitch.
Using 90.1% of CPU.
PS: I have an i7 8th gen processor.
-
Ahahaha
More of a surprise.
Just by mistake, I double-clicked a Bash (.sh) script on a WINDOWS machine.
Welp, some random bash processor appeared and the script was executed correctly.
I almost shit my pants; it's a script which changes the production env.
I was expecting Notepad lol
-
MacRant: I was waiting for a new MacBook Pro release for a while to upgrade my old laptop (not a Mac). Watched the release, had very mixed feelings about it, but still ordered (clenching my teeth and saying sorry to my wallet). The next day I looked into alternatives and cancelled the order to have more time to think; now deciding... I mean c'mon, no latest 7th gen processor, no 32GB memory option, 2GB video is OK for non-gaming, and the whole "big" thing is a Touch Bar that I DON'T F* NEED. They should drop the "Pro" and name it "Fancy Strip".
So I looked into alternatives, and the Dell XPS 15 with maxed specs is twice as juicy, and has not a touch bar but a whole freaking 4K touch screen, for a lower price :/
Just wanted to rant about the new MacBook's specs and price and see what you all think of the MacBook vs alternatives?
-
So, I decided to post this based on @Morningstar's conundrum.
I'm dissatisfied with the laptop market.
Why THE FUCK should I have to buy a gaming laptop with a GTX 1070 or 1080 to get a decent amount of RAM and a fucking great processor?
I don't game. I program. I don't even own a fucking Steam library, for clarification. Never have I ever bought a game on Steam. Disproving the notion that I might have a games library out of the way, I run Linux. Antergos (Arch-based) is my daily driver.
So, in 2017 I went on a laptop hunt. I wanted something with decent specs. Ultimately ended up going with the system76 Galago Pro (which I love the form factor of, it's nice as hell and people recognize the brand for some fucking reason). Matter of fact, one of my profs wanted to know how I accessed our LMS (Blackboard) and I showed him Chromium....his mind was blown: "It's not just text!"
That aside, why the fuck are Dell and system76 the only ones with decent portables geared towards developers? I hate the prospect of having to buy some clunky-ass Republic of Gamers piece of shit just to have some sort of decent development machine...
This is a notice to OEMs: yall need to quit making shit hardware and gaming hardware with no mid-range compromise. Shit hardware is defined as the "It runs Excel and that's all the consumer needs" and gaming hardware is "Let's put fucking everything in there - including a decent processor, RAM, and a GTX/Radeon card."
Mid-range that is true - good hardware that handles video editing and other CPU/RAM-intensive tasks and compiling and whatnot but NOT graphics-intensive shit like gaming - is hard to come by. Dell offers my definition of "mid-range" through Sputnik's Ubuntu-powered XPS models and what have you, and system76 has a couple of models that I more or less wish I had money for but don't.
TBH I don't give two fucks about the desktop market. That's a non-issue because I can apply the logic that if you want something done right, do it yourself: I can build a desktop. But not a laptop - at least not in a feasible way.
-
Today was a day at work that I felt like I made a significant contribution. It was not a lot of code. Actually it was a difference of 3 characters.
I am developing an industrial server so that my employer can provide access to their machines to enterprise industrial systems. You know, the big boys' toys. Probably in fucking Java...
Anyway, I am putting this server on an embedded system. So naturally you want to see how much serving a server can serve. In this case the device is more processor starved than memory starved. So I bumped up the speed of the serving from 1000 ms to 100 ms per sample. This caused the processor to jump from 8% of one core (as read from top) to 70%. Okay, 10x more sampling, then roughly 10x the CPU usage. That is good. I know some basic metrics for a certain amount of data at a couple of different sampling rates.
Now, I realized this really was not that much activity for this processor. I mean, it didn't seem to me that it "took much" to see a large increase of processor usage. So I started wondering about another process on the system that was eating 60 to 70 % all the time. I know it updated a screen that showed some not often needed data from its display among controlling things. Most of the time it will be in a cabinet hidden from the world. I started looking at this code and figured out where the display code was being called.
This is where it gets interesting. I didn't write this code. Another really good programmer I work with wrote this. It also seemed to be a pretty standard approach. It had a timer that fired an event every 50 ms. This is 20 times per second. So 20 fps, if you will. I thought: what would happen if I changed this to 250 ms? So I did. It dropped the processor usage to 15%! WTF?! I showed another programmer: WTF?! I showed the guy who wrote it: WTF?! I asked: what does it do? He said all it does is update the display. He said: let's take it to 1000 ms! I was hesitant, but okay. It dropped to 5%!
What is funny is several people all said: This is running kinda hot. It really shouldn't be this hot.
Don't assume; if you have a hunch, play with it if it's safe to do so. You might just shave 55 to 60% CPU usage off your system.
So the code I ended up changing: "50" to "1000".
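In sketch form, the whole fix was just a bigger interval on the periodic update - a Python illustration of the idea, not the actual codebase; `redraw` and `keep_running` are stand-ins:

```python
import time

UPDATE_INTERVAL_S = 1.0   # was 0.05 (50 ms, i.e. 20 redraws per second)

def display_loop(redraw, keep_running):
    # Redrawing a rarely-watched status screen 20 times a second was eating
    # 60-70% of a core; once per second turned out to be more than enough.
    while keep_running():
        redraw()
        time.sleep(UPDATE_INTERVAL_S)
```
-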
I am building a PC for the first time and have thought about every step more than twice. This is going to be my build:
Processor: AMD Ryzen™ 7 2700
Mainboard: X370
RAM: Corsair DIMM 8 GB DDR4-2400
Video card: Zotac NVIDIA GeForce GTX 1050 Ti
SSD: Samsung 960 EVO 250 GB
HDD: Seagate ST1000DM010 1 TB
PSU: PURE POWER 10 | 300W
Case: Aerocool Cylon RGB Midi-Tower - black
What are your opinions on this build?
-
Friend: Hey! I wanna buy a laptop.. range is about entry level, nothing hi-fi! But it should work for 3-4 years.
Me: Sure.. give me a few hours.. I'll get back.
*Looks all around for the best thing in that price range.
*Sends a list of laptops ranked based on value for money.
Friend: Bought it! Yay! 😎😎😎
*Buys the shittiest laptop they could find at that price range, with an absolutely ancient processor.
Why the fuck did you even ask me in the first place? A fucked couple of hours for me.
-
Found this little gem in the AMD64 reference manual:
"When PCIDs are enabled the system software can store 12-bit PCIDs in CR3 for different address spaces. Subsequently, when system software switches address spaces (**by writing the page table base pointer in CR3[62:12]**), the processor **may use TLB mappings previously stored for that address space and PCID**".
later:
"Updates to the CR3 register cause the entire TLB to be invalidated except for global pages."
So let me get this straight: PCIDs allow you to reuse TLB entries (instead of flushing the entire TLB) when writing a new address space to CR3 but writing to CR3 always flushes the entire TLB anyways
Just why 🤦‍♂️
-
Short story for the ones interested in the image: when we change the idea, we change the whole idea. And it is likely to happen very often. Sometimes twice a day, every day, for a week.
Long story:
I am hopeless:
I am an IT university student, I know how to program and how to search for a fucking manual, but I am dealing with electronics and PCBs...
I have to make the firmware for a board (Atmel things) and it has to talk via SPI with some other devices (it is a slave of one, and master for all the others (I will use two SPI channels)); this should be easy...
I have no senior to ask; all I have is Google, and I find problems in every single thing I try to do, every - fucking - single - one!!!! I know that the solution is always of the "you have to plug it in" type, but
NOT EVEN GOOGLE IS BEING OF HELP!
Let me explain this morning's pain:
I can't add libraries in Atmel Studio, something is wrong with the ASF wizard, and I have only found a tutorial that says which buttons to press to solve my problem... I DO NOT HAVE THESE BUTTONS!!!
And the library I wanted to add is the one that makes the board talk with the computer on its COM port... (And gives some debug messages...)
And the wizard gives problems because I created the project using an online Atmel tool...
YES, I tried to create a project with ASF and then add the files given by the online tool.... THEY DO NOT COMPILE, and I WOULD HAVE TO MESS WITH A 400-LINE-LONG MAKEFILE that is anything but human-readable...
I haven't even looked for anything SPI-related this morning.
I am even forced to use Windows, because every question in the forums and every noobish tutorial is based on it...
And then I find the tutorial with the perfect title: holy shit, this is the thing I truly need!!!!! It says how to open a file. And then stops. WHAT ABOUT THE THING YOU WERE TALKING ABOUT IN THE TITLE??????
This project is the upgrade of a glue pump based on an ATmega328 (Arduino Uno processor) that is currently being produced and sold by our "company".... .... which is composed of me and the boss.
He is a very nice and smart person; he tries to give me ideas for the solution, and if I cannot find out how to do something we can even change a lot of specifics of the project (the image shows our idea change), and every board gets some weeks of mornings like the one described above (I work only in the mornings).
I am learning a whole lot of things...
But the fact that everything I try fails is destroying me. What would you do in my place?
PS: Lots of love for the ones who made it until the end <3
-
So my wife bought a really old Android tablet (it's on Gingerbread lol), so I've decided to bring it up to Android Pie. Yes, that means building a custom ROM, from scratch, for a 7-year-old device. I will be documenting my progress, and if I fail then at least it will be published research. The memory optimization in Android Pie is so much smoother now that it should be good. If that fails I shall try to build Android Go for the device.
It's still got a 1.5GHz processor and 1GB of RAM, which should be fine, so here's hoping.
-
From Sarah Connor Chronicles, 2008: "They used to think that 12 nanometer scale was impossible. The circuits are so tiny, you're all but in the quantum realm. It's the most sophisticated processor on earth. If you could take your memories, your consciousness, everything that makes you a person, turn it into pure data, and download it onto a machine, that chip could run it."
I'm watching the DVD on a quad-core Ryzen APU that is built on 12nm, and it was already outdated when I bought it last year. I guess I'd better download myself to my laptop, because that's a 7nm Ryzen.
-
Running code in a JVM ... which is a virtual machine...
Inside a VM that runs Linux...
Inside a host OS that runs on native...which runs on a CISC processor... that internally runs a RISC architecture... so that makes the CISC a VM...
The RISC architecture I am pretty sure runs on Elf Magic... I am fairly certain Turing was an Elf working for Santa...
So I am really running my code on VM Elf Magic
-
Dear brain, could u please work?
"No you motherfugging arsehole, scratch the sand out of your vagina and make yourself your own processor. Fuck u."
Seems like it's the jolly season of "my brain is uncooperative and unwilling".
-
What a new year's start..
"Kernel memory leaking Intel processor design flaw forces Linux, Windows redesign"
"Crucially, these updates to both Linux and Windows will incur a performance hit on Intel products. The effects are still being benchmarked, however we're looking at a ballpark figure of five to 30 per cent slow down"
"It is understood the bug is present in modern Intel processors produced in the past decade. It allows normal user programs – from database applications to JavaScript in web browsers – to discern to some extent the layout or contents of protected kernel memory areas."
"The fix is to separate the kernel's memory completely from user processes using what's called Kernel Page Table Isolation, or KPTI. At one point, Forcefully Unmap Complete Kernel With Interrupt Trampolines, aka FUCKWIT, was mulled by the Linux kernel team, giving you an idea of how annoying this has been for the developers."
>How can this security hole be abused?
"At worst, the hole could be abused by programs and logged-in users to read the contents of the kernel's memory."
https://theregister.co.uk/2018/01/...
-
I asked one of my engineering classmates which processor they had in their laptop.
Ans: 3GB.
I don't know whether they don't know shit about computers or they are just too bad at English.
-
First year: intro to programming, basic data structures and algos, parallel programming, databases and a project to finish it. Homework should be kept track of via some version control. Should also be some calculus and linear algebra.
Second year:
Introduce more complex subjects such as programming paradigms, compilers and language theory, low level programming + logic design + basic processor design, logic for system verification, statistics and graph theory. Should also be a project with a company.
Year three:
Advanced algos, data structures and algorithm analysis. Intro to computer and data security. Optional courses in graphics programming, machine learning, compilers and automata, embedded systems, etc. It ends with a big project that goes in depth into a CS subject, not a regular software project in Java, basically.
-
Must nearly every recently-made piece of software be terrible?
Firefox runs terribly slowly on a four-core 1.6GHz processor when given eight (8) gigabytes of RAM. Discord's user interface is awfully slow and uses unnecessary animations. Google's stuff is just falling apart; a toaster notification regarding MRO stock was recently pushed such that some markup elements of this notification were visible in the notification, the download links which are generated by Google Drive have sometimes returned error 404, and Google's software is overall sluggish and somewhat unstable. Today, an Android phone failed to update the Google Drive application... and failed to return a meaningful error message. Comprehensive manuals appear to be increasingly often not provided. Microsoft began to digest Windows after Windows XP was released.
Laziness is not virtuous.
Every computer program should be written such that it performs well on reasonably terrible hardware... and kept simple. The UNIX philosophy is woefully underappreciated.
-
They all want to make games like this, and minimum requirements are:
Processor: Intel Core i5 3470 @ 3.2GHZ (4 CPUs) / AMD X8 FX-8350 @ 4GHZ (8 CPUs)
Memory: 8GB.
Video Card: NVIDIA GTX 660 2GB / AMD HD7870 2GB.
Sound Card: 100% DirectX 10 compatible.
HDD Space: 65GB.
-
Well, here is another Intel CPU flaw.
I'm starting to think that all these were done on purpose...
https://thehackernews.com/2019/05/...
-
!rant
My laptop just died for the third time.
Need to buy a new one. Not a Mac.
A Linux machine with a 15" screen, SSD, USB-C, 8GB of RAM and a graphics processor.
Suggestions?
-
Rant:
I am at work, and someone says to me that this system we are working on is multithreaded. I tell them no, it's not multithreaded, and in this context things cannot happen concurrently. It's a single-core ARM7TDMI. Arguments ensue about the difference between multithreading, multitasking and multiprocessing. I proceed to explain that this is a multitasking, interrupt-driven system, with no context switching or memory segmentation, so one heap for all tasks, because that's how we have it configured and there is only one core. So there is no way the error he just described could possibly happen. Then he tells me I'm wrong but refuses to even look at the processor manual and rejects the Wikipedia entry for multithreading. So I plan on calling off so I can just have the next two weeks off while he tries to figure out why two things are happening at once on this system. He deserves all the frustration that is to follow.
-
For the first time ever, I locked up a processor while working. Take that, 24 cores!
Unrelatedly, if someone is in the office, could you please power cycle my box? ...Thanks.
-
Because I own http://grnail.co.uk and http://hotrnail.co.uk (which I bought to prevent scammers having access to them), I often get emails about people's accounts. I could do a password reset and own these accounts, but of course, I don't.
However, today I started getting passport scans and personal details from Syria...
-
I got core-count shamed by a client today. He has a 64-core Ryzen and I have a quad-core i7.
I want to upgrade, I do! But the new tech coming out this year is just too good not to wait for! Plus I waited 8 years, I can wait a few more months. Right????
-
Things that seem "simple" but end up taking a long ass time to actually deploy into production:
1. Using a new payment processor:
"It's just a simple API, I'll be done in 2 hours"
LOL sure it is, but testing orders and setting up a sandbox or making sure you have credentials right, and then switching from test to live and retesting, and then... fuck
2. Making changes to admin stats.
"'I just have to add this column and remove that one... maybe like a couple of hours"
YOU WISH
3. Anything JavaScript
"Hah, what, that's like a button, np"
125 minutes later...
console.log('before foo');
console.log(this.foo)
etc...
-
Can someone help me understand?
I subscribed to a nifty IT-related magazine, and on its back there's an ad for "Dedicated root server hosting". Nothing unusual at a first glance, but after I read the issue I decided to humor them and see what it is that they offered, and... It just... Doesn't make sense to me!
An ad for a "Dedicated Root Server" - what is a dedicated root server, first of all? Root servers of any infrastructure sound pretty important.
But, the ad also boasts "High speed performance with the new Intel Core i9-9900K octa-core processor", that's the first weird thing.
Why would anyone responsible enough want to put an i9 into a highly-reliable root server, when the thing doesn't even support ECC? Also, come on, octa-core isn't much, I deal with servers that have anywhere between 2 and 24 cores. 8 isn't exactly a win, even if it has a higher per-core clock.
Oh, also, further down the ad has a list of seeming advantages/specs of the servers; they proclaim that the CPU "incl. Hyper-Threading-Technology"... Isn't that... standard when it comes to servers? I have never seen a server without hyperthreading so far at my job.
"64 GBs of DDR4 RAM" - Fair enough, 64 gigs is a good amount, but... Again, its not ECC, something I would never put into a server.
"2 x 8 TB SATA Enterprise Hard Drive 7200 rpm" - Heh, "enterprise hard drive", another cheap marketing word, would impress me more if they mentioned an actual brand/model, but I'll bite, and say that at least the 7200 rpm is better than I expected.
"100 GBs of Backup Space" - That's... Really, really little. I've dealt with clients who's single database backup is larger than that. Especially with 2x8 TB HDD (Even accounting for software raids on top)
This one cracks me up - "Traffic unlimited"
Whaaaat?! You are not gonna give me a limit to the total transferred traffic to the internet for my server in your data center? Oh, how generous of you, only, the other case would make the server just an expensive paperweight! I thought this ad was for semi-professionals at least, so why mention traffic, and not bandwidth, the thing that matters much more when it comes to servers? How big of a bandwidth do I get? Don't tell me you use dialup for your "Dedicated Root Server"s!
"Location Germany or Finland" - Fair enough, geolocation can matter when it comes to latency.
"No minimum contract" - Oooh, how kiiiind of you, again, you are not gonna charge me extra for using the server only as long as I pay? How nice!
"Setup Fee £60" - I guess, fair enough, the server is not gonna set itself up, only...
The whole ad is for "monthly from £55.50", that's quite the large fee for setup.
Oh, and a cherry on top, the tiny print on the bottom mentions: "All prices exclude VAT and are a subject to..." blah blah blah.
Really? I thought that this sort of near customer deceit was present only in the common people's sphere!
I must say, there's being unimpressed, and then... There's this. Why, just... Why? Does anyone understand this? Because I don't...
-
Hey Hey!
Have a look at my latest Ubuntu theme.
Displaying CPU Power Manager, where I can overclock and take control of my processor.
Drop-down terminal with a transparency GNOME theme.
Quite far to go for someone with limited knowledge at the moment.
Any advice and feedback is welcome! :)
-
I don't mind Apple marketing themselves as these revolutionary thinkers and innovators, because I figured most people see through the marketing but appreciate Apple for what it is. It's a big company that makes well-built products that are efficient, and it gives good support for those products.
But I'm sick to death of tech journalists talking about how every new feature is the death of Android. They have to be kidding themselves if they think what Apple's doing is innovating. Samsung's been designing screens for the bezelless market for a LONG time, and their technology in that is incredibly advanced (it's why if you use their iPhone X you'll be looking at a screen from Samsung!)
They finally adopted wireless charging and pretended it was brand new, but I remember when they came out with the Apple Watch, marketing it like they'd broken ground when Android Wear watches had been out for a year!
I don't want people to think I hate Apple; I own a few of their products. I think they're remarkably invested in user privacy; HomeKit imo is one of the most forward-thinking implementations of smart home technology that I've seen, and the new processor in the iPhone X is a mammoth powerhouse. So, I'm not necessarily saying anything about that, but what I am saying is that they're incredible at marketing, and fanboys are not self-aware enough to recognize when the Designed-by-Apple hype overshadows the actual objectivity of the situation. There are articles already talking about Apple's wireless charging.
TL;DR I swear to god, if an Apple fanboy comes at me saying the bezelless design was Apple's innovation, I'm going to snap. I appreciate what Apple does well, but unfortunately people can't appreciate a product without needing to identify with it.
-
Oh I have quite a few.
#1 a BASH script automating ~70% of all our team's work back in my sysadmin days. It was like a Swiss army knife. You could even do `ScriptName INC_number fix` to fix a handful of types of issues automagically! Or `ScriptName server_name healthcheck` to run HW and SW healthchecks. Or things like `ScriptName server_name hw fix` to run HW diags, discover faulty parts, schedule a maintenance timeframe, raise a change request to the appropriate DC and inform service owners by automatically chasing them for CHNG approvals. Not to mention you could `ScriptName -l "serv1 serv2 serv3 ..." doSomething` and similar shit. I am VERY proud of this util. Employee liked it as well and got me awarded. Bought a nice set of Swarowski earrings for my wife with that award :)
#2 a JAVA sort-of-lib - a ModelMapper - able to map two data structures with a single util method call. Defining datamodels like https://github.com/netikras/... (note the @ModelTransform anno) and mapping them to my DTOs like https://github.com/netikras/... .
#3 a @RestTemplate annptation processor / code generator. Basically this dummy class https://github.com/netikras/... will be a template for a REST endpoint. My anno processor will read that class at compile-time and build: a producer (a Controller with all the mappings, correct data types, etc.) and a consumer (a class with the same methods as the template, except when called these methods will actually make the required data transformations and make a REST call to the producer and return the API response object to the caller) as a .jar library. Sort of a custom swagger, just a lil different :)
I had #2 and #3 opensourced but accidentally pushed my nexus password to gitlab. Ever since my utils are a private repo :/3 -
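For a rough idea of what the #1 util's interface looked like from the outside, here is a minimal sketch of a `ScriptName <target> <action>` style dispatcher. It's written in Go rather than Bash purely as an illustration, and every name in it (the tool name, the actions, the messages) is hypothetical; the original script and its internals aren't shown anywhere here.

```go
// Hypothetical sketch of a "Swiss army knife" ops dispatcher:
// usage: opstool <server|INC_number> <action...>
// All actions are stubs; none of the real runbook logic is known or shown.
package main

import (
	"fmt"
	"os"
	"strings"
)

func main() {
	if len(os.Args) < 3 {
		fmt.Fprintln(os.Stderr, "usage: opstool <target> <action...>")
		os.Exit(1)
	}
	target := os.Args[1]
	action := strings.Join(os.Args[2:], " ")

	switch action {
	case "healthcheck":
		fmt.Printf("running HW and SW healthchecks on %s\n", target)
	case "hw fix":
		fmt.Printf("running HW diags on %s and raising a change request for faulty parts\n", target)
	case "fix":
		fmt.Printf("auto-fixing known issue classes for incident %s\n", target)
	default:
		fmt.Fprintf(os.Stderr, "unknown action %q for target %s\n", action, target)
		os.Exit(1)
	}
}
```

The value of that kind of tool is exactly what the rant describes: one entry point, with each subcommand wrapping an entire runbook.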
Who are devranters?
I know many devs and very few of them run Linux as their primary OS. And I've never met a single one using Arch.
Also, hardly any use Vim as their primary IDE...or even editor.
Yet, if DevRant was my first introduction to devs I'd be down Best Buy looking for a laptop (why so many laptops here?) running Arch and Vim as my word processor.
Don't misunderstand me---I have nothing against Arch and Vim. I don't give a rat's arse about the OS on my machine as I'm mostly in apps. I'm sure Arch would be fine. And whatever floats anyone's boat is fine by me.
But where are all the devs maintaining VB6 apps using XP? Is the community inclusive enough to welcome them?
Where are the "dark matter" devs? Lurking? Speak up!
Now, it may be that, say, China and India run on Arch Linux and Vim and I have a limited perspective. If so, Wow! My eyes are opened.10 -
OK so... project I've been working on! It's a virtual processor that runs in the browser coded in JavaScript. OK so I know, I know, you must be thinking, "this is crazy!" "Why would she do this?!?!" and I understand that.
The idea of Tangible is to see if I can get any tangible performance over JavaScript. I've posted a poorly drawn diagram below showing how Tangible works.
The goal for Tangible is to not use HTML, JavaScript, or CSS. Instead, you would use, say for instance, C++ and write your web page in that, then you compile it using my Clang plugins and out pops your bytecode for Tangible. No more CSS, no more HTML, and no more JavaScript. Instead everything from a textbox to a video on your web page is an object, each object can be placed into a container, and each container follows specific flag rules like centerHorizontal or centerVertical.
Added to all of this, you get the optimization of the LLVM optimizer.18 -
PayPal = GayPal
PHASE 1
1. I create my personal gaypal account
2. I use my real data
3. Try to link my debit card, denied
4. Call gaypal support via international phone number
5. Guy asks me for my full name email phone number debit card street address, all confirmed and verified
6. Finally i can add my card
PHASE 2
7. Now the account is temporarily limited and in review, for absolutely no fucking reason, need 3 days for it to be done
8. Five (5) days later still limited i cant deposit or withdraw money
9. Call gaypal support again via phone number, burn my phone bill
10. Guy tells me to wait for 3 days and he'll resolve it
PHASE 3
11. One (1) day later (and not 3), i wake up from a yellow account to a red account where my account is now permanently limited WITHOUT ANY FUCKING REASON WHY
12. They blocked my card and forever blocked my name from using gaypal
13. I contact them on twitter to tell me what their fucking problem is and they tell me this:
"Hi there, thank you for being so patient while your conversation was being escalated to me. I understand from your messages that your PayPal account has been permanently limited, I appreciate this can be concerning. Sometimes PayPal makes the decision to end a relationship with a customer if we believe there has been a violation of our terms of service or if a customer's business or business practices pose a high risk to PayPal or the PayPal community. This type of decision isn’t something we do lightly, and I can assure you that we fully review all factors of an account before making this type of decision. While I appreciate that you don’t agree with the outcome, this is something that would have been fully reviewed and we would be unable to change it. If there are funds on your balance, they can be held for up to 180 days from when you received your most recent payment. This is to reduce the impact of any disputes or chargebacks being filed against you. After this point, you will then receive an email with more information on accessing your balance.
As you can appreciate, I would not be able to share the exact reason why the account was permanently limited as I cannot provide any account-specific information on Twitter for security reasons. Also, we may not be able to share additional information with you as our reviews are based on confidential criteria, and we have no obligation to disclose the details of our risk management or security procedures or our confidential information to you. As you can no longer use our services, I recommend researching payment processors you can use going forward. I aplogise for any inconvenience caused."
PHASE 4
14. I see they basically replied in context of "fuck you and suck my fucking dick". So I reply aggressively:
"That seems like you're a fraudulent company robbing people. The fact that you can't tell me what exactly have i broken for your terms of service, means you're hiding something, because i haven't broken anything. I have NOT violated your terms of service. Prove to me that i have. Your words and confidentially means nothing. CALL MY NUMBER and talk to me privately and explain to me what the problem is. Go 1 on 1 with the account owner and lets talk
You have no right to block my financial statements for 180 days WITHOUT A REASON. I am NOT going to wait 6 months to get my money out
Had i done something wrong or violated your terms of service, I would admit it and not bother trying to get my account back. But knowing i did nothing wrong AND STILL GOT BLOCKED, i will not back down without getting my money out or a reason what the problem is.
Do you understand?"
15. They reply:
"I regret that we're unable to provide you with the answer you're looking for with this. As no additional information can be provided on this topic, any additional questions pertaining to this issue would yield no further responses. Thank you for your time, and I wish you the best of luck in utilizing another payment processor."
16. ARE YOU FUCKING KIDDING ME? I AM BLOCKED FOR NO FUCKING REASON, THEY TOOK MY MONEY AND DONT GIVE A FUCK TO ANSWER WHY THEY DID THAT?
HOW CAN I FILE A LAWSUIT AGAINST THIS FRAUDULENT CORPORATION?12 -
One thing that’s a shocker and frankly very weird for people who have always used Android, is that iPhone doesn’t show any progress notif for anything whatsoever. Like dude.. I want things to happen in background and see progress in notif bar. But no, not in iPhone. You either wait for things to finish in foreground or do it explicitly inside the relevant app.
For example, when you want to send a big video on WhatsApp via Photos, you have to wait on the Photos screen until it’s sent otherwise it fucking fails. Like dude.. wtf?! Why can’t that happen in background?
On top of this, things that can happen in the background get so little processing power (because iPhone doesn't like things happening in the background; we have already established that though) that they just crawl until done and sometimes fail.
Another thing is that there are no fucking loading indicators. You touch something and then the guessing game starts whether you touched it correctly or not. Like dude.. I know your phone got a superfast processor but sometimes things take time to happen. You gotta give some kind of indication that things are happening ffs!
I know security and all, but dude you gotta give me something! Don’t make me suffer for little things.
Dude.. fuck you!6 -
Fuck Windows 10. Period.
An amateur shit-show of junk. If you have an i3 processor it will find a way to choke it to 80% with the bloody audiodg.exe.
I have an i7 and Windows Audio Device Graph Isolation takes 25% CPU to play a YouTube video and 12-13% when idle.
Junk spaghetti with some half-useless UI over the same settings that were available in much older Windows versions.
I hate having a decent laptop (16 GB RAM, 512 GB SSD, Radeon and so on), only for it to be disabled and abused by Windows and Chrome.15 -
Many people asked me this.
Every programming language is built on top of another, and since assembly is the lowest-level language, every language is ultimately built on it. So what is assembly made of?
...
When you buy a vacuum cleaner, they give you instructions on how to use it. When a processor manufacturer creates a processor, they give you instructions on how to use it. The assembly programming language is nothing but the set of instructions the processor manufacturer gave us.5 -
Conversation yesterday (senior dev and the mgr)..
SeniorDev: "Yea, I told Ken when using the service, pass the JSON string and serialize to their object. JSON eliminates the data contract mismatch errors they keep running into."
Mgr: "That sounds really familiar. Didn't we do this before?"
SeniorDev: "Hmmm...no. I doubt anyone has done this before."
Me: "Yea, our business tier processor handled transactions via XML. It allowed the client and server to process business objects regardless of platform. Partners using Perl,
clients using Delphi, website using .aspx, and our SQLServer broker even used it."
Mgr: "Oh yea...why did we stop using it?"
Me: "WCF. Remember, the new dev manager at the time and his team broke up the business processor into individual WCF services."
Mgr: "Boy, that was a crap fest. We're still fighting bugs from the mobile devices. Can't wait until we migrate everything to REST."
SeniorDev: "Yea, that was such a -bleep-ing joke."
Me: "You were on Jake's team at the time. You were the primary developer in the re-write process saying passing strings around wasn't the way true object-oriented developers write code.
So it's OK now because the string is in JSON format, or because using a JSON string was your idea?"
SeniorDev turns around in his desk and puts his headphones back on.
That's right you lying SOB...I remember exactly the level of personal attacks you spewed on me and other developers behind our backs for using XML as the message format.
Keep your fat ass in your seat and shut the hell up.3 -
WHY IS IT SO FUCKIN ABSURDLY HARD TO PUSH BITS/BYTES/ASM ONTO PROCESSOR?
I have bytes that I want ran on the processor. I should:
1. write the bytes to a file
2a. run a single command (starting virtual machine (that installed with no problems (and is somewhat usable out-of-the-box))) that would execute them, OR
2b. run a command that would image those bytes onto (bootable) persistent storage
3b. restart and boot from that storage
But nooo, that's too sensible, too straightforward. Instead I need to write those bytes as a parameter into a c function of "writebytes" or whatever, wrap that function into an actual program, compile the program with gcc, link the program with whatever, whatever the program, build the program, somehow it goes through some NASM/MASM "utilities" too, image the built files into one image, re-image them into hdd image, and WHO THE FUCK KNOWS WHAT ELSE.
I just want... an emulator? probably. something. something which out of the box works in a way that I provide file with bytes, and it just starts executing them in the same way as an empty processor starts executing stuff.
What's so fuckin hard about it? I want the iron here, and I want a byte funnel into that iron, and I want that iron to run the bytes i put into the fuckin funnel.
Fuckin millions of indirection layers. Fuck off. Give me an iron, or a sensible emulation of that iron, and give me the byte funnel, and FUCK THE FUCK AWAY AND LET ME PLAY AROUND.8 -
Just had an argument with someone who thinks (micro)Python is the way to go for embedded projects, because a lot of engineers are terrible at using C/C++. And, according to them, the arguments for optimisation and granular control over what the processor does don't matter.
It's utterly wrong to push technologies into areas they weren't originally designed for. We've seen it a lot with websites lately and I don't like that embedded is heading the same way!18 -
can i work in any more horrible company than this?
> got a shitty macbook air as official work laptop. i am an Android dev btw, nd fuck knows how long it took to build apps on this, but it was still okay
> after 1 year some keys started getting slow to respond but still working fine
> recently a Senior dev raised request for better laptops and somhow we all got macbook pros woth good ram/processor
> returned my old laptop, got a mail after few seconds that my laptop has liquid damage! (in retrospect , i think i knew it as my bag once got drenched in rain)
> few days later, a mail chain starts where some guy is asking for $300 approval of fixes from my boss's boss!
now fuck knows how is it going to get paid, but i cant afford it on my monthly salary.
i am already on a tight crunch as my dad recently lost his job and i am paying emis for a car loan as well as a hand fracture loan, but i am surprised that i am getting notified about this.
afaik,
1. the laptop's whole value is around $350 (some corporate quote that i once saw) .
2. the laptops should be fucking insured (we ourselves are a fuckin general insurance company) as it's an obvious norm for corporate equipment. i shouldn't be penalised for this
3. i was working fine with this laptop and i can still work on it if given back.
4. this can be deducted at the time of fnf or from gratuity fund that these assholes hold onto until a guy completes 5 years and take it all for themselves if he doesn't.
5 i can buy this shitty laptop back and use it as my personal device, or get it repaired for less.
i don't even claim to have damaged it, why are they putting it on me 😭😭😭8 -
Don't feed the pigeons.
A cautionary tale.
When you feed the pigeons they keep coming back. They don't stop pestering you for help, and they don't ever listen to you.
I gave my father-in-law my old laptop, and installed the latest version of Office 2016 because I'm a nice guy.
Now, every week at family dinner there's something he needs me to help him with.
Mind you, his previous computer had Windows XP and the one I gave him had Windows 7. So it was quite the tech upgrade for him.
Except one of his octogenarian siblings wrote a family recipe book, and wrote it in Word Processor. (because Old People!) Well fuck of course it has pictures, clip art, special formatting, vertical and horizontal lines. It worked fine on XP because Word Processor was supported by XP.
The following is me explaining to him over the phone why his recipe book wouldn't load into Word. I was in his house picking up 2000 rounds of ammo for my and my wife's pistols (target practice) while he was out and about.
FIL: "It's the link on the desktop. It comes up in Word on the old computer but when I tried to put it on the new computer it wouldn't work. I used a thumb drive."
Me: "Okay well I tried to..."
FIL: "I don't know why it would work in Word on one computer and not the next."
Me: "Okay, well I clicked on the link to the file on your old desktop and it opened in Word Processor, not Word."
FIL: "No it opens in Word on the old computer, but it won't open on the new one."
Me: "It opens in Word Processor on the old computer, it won't open in Word on..."
FIL: "Which computer are you sitting at? The old one is on the left." (as if I wouldn't recognize the computer I had for three years and just gave him a month ago!)
Me: "The old one."
FIL: "Okay so it should open in Word on the old computer."
Me: "It won't. It will open in..."
FIL: "I was thinking maybe it had something to do with a screen that popped up when I logged in to the new computer. Something about antivirus software?"
Me: "It will open in Word Processor on your old computer, but it isn't formatted..."
FIL: "Yeah, it's a '.-w-p-s' file so it should work in Word."
Me: "Word Processor is a different program from Word. This opens in Word Processor."
(long silence)
FIL: "So which one do I have?"
Me: "You have Word Processor on the old computer."
FIL: "So how do I get Word Processor on the new computer?"
Me: "You don't. It is defunct software, it was discontinued ten years ago. You can try to get a converter online, but there's no guarantee it'll work."
FIL: "Alright, I'll be home in a few minutes. I'll take a look then."
This was at 10pm last night, and I'd been out all day since 7:30am. He still didn't believe me that the book was written in Word Processor until I showed him the different startup screen for Word Processor, where it says "Word Processor" plain as day.
I fed the pigeon. And it looks like there's more of this to come.3 -
CIA – Computer Industry Acronyms
CD-ROM: Consumer Device, Rendered Obsolete in Months
PCMCIA: People Can’t Memorize Computer Industry Acronyms
ISDN: It Still Does Nothing
SCSI: System Can’t See It
MIPS: Meaningless Indication of Processor Speed
DOS: Defunct Operating System
WINDOWS: Will Install Needless Data On Whole System
OS/2: Obsolete Soon, Too
PnP: Plug and Pray
APPLE: Arrogance Produces Profit-Losing Entity
IBM: I Blame Microsoft
MICROSOFT: Most Intelligent Customers Realize Our Software Only Fools Teenagers
COBOL: Completely Obsolete Business Oriented Language
LISP: Lots of Insipid and Stupid Parentheses
MACINTOSH: Most Applications Crash; If Not, The Operating System Hangs
AAAAA: American Association Against Acronym Abuse.
WYSIWYMGIYRRLAAGW: What You See Is What You Might Get If You’re Really Really Lucky And All Goes Well.2 -
Well, I wanna specialize in low-level software as I get older. Everyone is telling me to go out and learn a processor architecture. I'm willing to be patient, so I do what people recommend to me and I download the Intel x86_64 manual. I was excited... UNTIL I REALIZED THE MANUAL WAS 4474 PAGES LONG! Like, how am I supposed to jump into assembly, machine language, and low-level programming with a beginner's task like that? I cannot find ANY resources online to simplify the transition, and college sure ain't gonna teach me anytime soon.10
-
I want to play GTA V on full specs - but computer doesn't have enough juice. Anyone knows the name of the website from where I can download an i7 processor, 5TB of Memory, and 16GB RAM. I have already tried SourceForge. Please post the link below, if you happen to know one.3
-
Upscaling a prod database which was running on an 8 year old Dell desktop used as server. It had about 2MB of RAM and an Intel Core 2 processor...
This was the day I've learned a lot about querying the database as efficient as humanly possible.3 -
Windows makes me genuinely angry. Why is it that when I boot my computer, I am expected to wait 10+ minutes for windows to launch 5 startup applications, most of which are already patches for things that should be there to begin with, before I can even begin to use explorer to open GeForce experience because for some reason, windows said "Graphics drivers?! Who needs those?!" And threw them out the window! And then I get notifications about apps needing permissions to things, BUT IT WONT TELL ME WHICH ONE! I clicked the update driver notification 5 minutes ago and the installer literally just now opened up. This is a computer with a r3 processor and gtx970! It may not be the best, but it is by no means underpowered! Why must Halo online not have a Linux version? :(4
-
ME - me, TM - teammate
I was just recruited into the company. We're starting a new project based on a few modules.
ME: So this module will do X and Y, I will use good old interfaces and design based on abstractions so that stuff does not get glued too much.
TM: But why? Make good old processor with all the logic and throw objects at it.
ME: B-but unit tests, decomposition and other stuff...
TM: *insists and forces me to agree*
ME: *gets shit done his way, TM checks on code review and complains but generally doesn't give a fuck*
ME: Ok, it's done. Let's get shit shipped.
TM: Well, we were just told by the PM that we will need to process one more source with very different logic that does not fit the current solution (he meant the GOD-PROCESSOR, that idea of his).
ME: What do you mean? *injects another contextual implementation of processing logic into the template method pattern solution*.
TM: I will tell PM you cant make it because of the implementation.
ME: But I just did it...
TM: Impossible, processor needs to be reimplemented. Get your shit together!
ME: *still doesn't get the shit about the god processor love*
TM: *rage quits next month*
ME: *module gets reused once more 2 months later, profit* -
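The "interfaces and abstractions" approach ME argues for in the exchange above boils down to one small implementation per data source behind a shared abstraction, so "one more source with different logic" is just a new type. A minimal sketch of that idea, written in Go with entirely made-up source names (the rant doesn't say what language or domain this actually was):

```go
package main

import "fmt"

// Processor is the abstraction: one implementation per data source,
// so a new source with different logic is a new type, not a rewrite.
type Processor interface {
	Process(record string) (string, error)
}

// CSVSource and LegacyAPISource are hypothetical sources with different logic.
type CSVSource struct{}

func (CSVSource) Process(record string) (string, error) {
	return "csv:" + record, nil // source-specific parsing would live here
}

type LegacyAPISource struct{}

func (LegacyAPISource) Process(record string) (string, error) {
	return "legacy:" + record, nil // completely different logic, same contract
}

// run only knows about the interface, never about concrete sources.
func run(p Processor, records []string) {
	for _, r := range records {
		out, err := p.Process(r)
		if err != nil {
			fmt.Println("skip:", err)
			continue
		}
		fmt.Println(out)
	}
}

func main() {
	run(CSVSource{}, []string{"a", "b"})
	run(LegacyAPISource{}, []string{"x"}) // the "one more source" case: no changes to run()
}
```

The GOD-PROCESSOR alternative puts all of the per-source branching inside one function, which is exactly what makes a new source feel "impossible" to add.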
How are people writing documents/reports? Markdown/LaTeX in your fav editor and export to PDF? Word? Pages? LibreOffice?
I use Markdown, via VSCode, Bear, iA Writer, Marked and then push to PDF. If I have to, then export to DOCX.5 -
A rant that caught my attention on the MacRumors forum.
.
I pre-calculated projected actual overall cost of owning my i5/5/256 Haswell Air, which I got for $1500.
After calculations, this machine would cost me about $3000 for 3 years of use.
(Apple Care, MS Office Business, Parallels, Thunderbolt adapter to HDMI, Case... and so on).
Yea... A lot of people think it's all about the laptop with Apple. nah... not at all. There's a reason Apple is gradually dropping the price of their laptops.
They are slowly moving to a razor and blade business model... which basically is exactly what it sounds like - you buy the razor which isn't too expensive, but you've got no choice but to buy expensive additional blades.
I doubt Apple is making much money from laptop sales alone... well, definitely not as much as they were making 5 years or so ago (remember the original Air was about $1800 for the base model, and if I remember correctly, $1000 additional dollars to upgrade to a 64GB SSD from the base HDD).
Yes, ONE THOUSAND DOLLARS FOR 64GB SSD!
Well, anyways, the point is that Apple no longer makes the BIG bucks from the laptop alone, but they still make good profits from upgrades. $300 to go to 512GB SSD from 256, $100 for 4GB extra ram, and $150 for a small bump in processor. They make good profits from these as well.
But that's not where they make mo money. It's once you buy the Macbook, they've got you trapped in their walled garden for life. Every single apple accessory is ridiculously overpriced (compared to market standards of similar-same products).
And Apple makes their own cables and ports. So you have to buy exclusively for Apple products. Every now and then they will change even their own ports and cables, so you have to buy more.
Software is exclusive. You have no choice but to buy what apple offers... or run windows/linux on your Mac.
This is a douche-level move comparable to, say, Microsoft changing the USB port every 2-3 years and having exclusive rights to sell the devices that plug in.
No, instead, Intel-Microsoft and them guys make ports and cables as universal as possible.
Can you imagine if USB3.0 was thinner and not backwards compatible with usb2.0 devices?
Well, if it belonged to Apple that's how it would be.
This is why I held out so long before buying an apple laptop. Sure, I had the ipod classic, ipod touch, and more recently iPad Retina... but never a laptop.
I was always against apple.
But I factored in the pros and cons, and I realized I needed to go OS X. I've been fudged by one virus or another during my years of Windows usage. Trojans, spywares. meh.
I needed a top-notch device that I can carry with me around the world and use for any task which is work related. I figured $3000 was a fair price to pay for it.
No, not $1500... but $3000. Also I'm dead happy I don't have to worry about heat issues anymore. This is a masterpiece. $3000 for 3 years equals $1000 a year, fair price to pay for security, comfort, and most importantly - reliability. (of course awesome battery is superawesome).
Okay I'm going to stop ranting. I just wish people factored in additional costs from owning an a mac. Expenses don't end when you bring the machine home.
I'm not even going to mention how they utilize technology-push to get you to buy a Thunderbolt display, or now with the new Air - to get a time capsule (AC compatible).
It's all about the blades, with Apple. And once you go Mac, you likely won't go back... hence all the student discounts and benefits. They're baiting you to be a Mac user for life!
Apple Marketing is the ultimate.
source: https://forums.macrumors.com/thread...3 -
I should stop trying to be a developer and become a comedian, or perhaps a meme maker or something...
My rants are better than my code.
I mean, I got 100+ +1s on a rant and my code never runs!
https://devrant.io/rants/654457/...3 -
So for a while I have wanted to build a Raspberry Pi cluster. In the spirit of Shia LaBeouf I got started last Saturday.
I had two Pis lying around so I figured I'd run some experiments before I invested in a lot of hardware. After about a day I had turned the two Pis into a shared cluster when disaster struck....
I had completely ignored the fact that you cannot run x86 32- or 64-bit software on an ARM processor (I know... I'm a Java developer). So when I booted my service and the load balancer, I found that nothing worked. So, pretty bummed out, I quit the project.
Later that day I found a crazy guy who had bought a batch of 400 small form factor PSUs (300W) and internally I laughed at him a little. I mean, who's gonna sell 300W irregular power supplies. Then, just as I was about to go to bed I found this guy, he was selling from a batch of CPU-onboard motherboards for 10 bucks each and everything clicked!
I did some quick calculations and decided I could probably gather enough cash to get: 10 motherboards, 10 2GB RAM DIMMs, 10 SATA disks and 14 PSUs (in case some fail) and some misc hardware for networking and such.
So... Long story short, I am going to build a cluster computer, the first version is going to have 10 nodes and I am waiting for delivery right now!12 -
Can anyone help me choose an antivirus for Windows ? If it is free it would be great !!
I am a full time linux lover, but I have to set up a windows pc for my parents, possibly win 7 on Intel NUC with atom processor.
The requirement is just ms office and similar applications.15 -
RONALD REAGAN VIRUS: Saves your data, but forgets where it's stored.
MIKE TYSON VIRUS: Quits after two bytes.
OPRAH WINFREY VIRUS: Your 300 MB hard drive suddenly shrinks to 100 MB, then slowly expands to 200 MB.
TITANIC VIRUS: Your whole computer goes down.
DISNEY VIRUS: Everything in your computer goes Goofy.
PROZAC VIRUS: Screws up your RAM but your processor doesn't care.
ARNOLD SCHWARZENEGGER VIRUS: Terminates some files, leaves, but will be back.4 -
Monday morning we found out our main event queue hadn't processed since late Wednesday afternoon. Shit was hitting the fan and we were stumped. What had changed?!? Why wasn't the queue processor running?!?
Turns out a server restart had killed the job (no worries there, surely?!), but it also turns out the job checked for a system flag on disk to stop it from running multiple instances, or in this case (since the flag was still present after the restart) from running at all. Got to love the little things that really screw you over.6 -
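The failure mode above is a run flag that outlives the process it was guarding. A rough sketch of a slightly safer variant, where the flag stores the PID and is treated as stale if that process no longer exists; the path and names are made up and this is not the code from the rant:

```go
// Unix-only sketch: syscall.Kill with signal 0 is used purely as an existence check.
package main

import (
	"fmt"
	"os"
	"strconv"
	"strings"
	"syscall"
)

const flagPath = "/tmp/queue-processor.lock" // hypothetical location

// acquire fails only if another *live* process holds the flag.
// A flag left behind by a reboot or a killed job is treated as stale.
func acquire() error {
	if data, err := os.ReadFile(flagPath); err == nil {
		pid, convErr := strconv.Atoi(strings.TrimSpace(string(data)))
		if convErr == nil && processAlive(pid) {
			return fmt.Errorf("already running as pid %d", pid)
		}
		// stale flag: the owning process no longer exists
		os.Remove(flagPath)
	}
	return os.WriteFile(flagPath, []byte(strconv.Itoa(os.Getpid())), 0o644)
}

// processAlive checks whether a process with this PID still exists.
func processAlive(pid int) bool {
	err := syscall.Kill(pid, syscall.Signal(0))
	return err == nil || err == syscall.EPERM
}

func main() {
	if err := acquire(); err != nil {
		fmt.Println("not starting:", err)
		return
	}
	defer os.Remove(flagPath)
	fmt.Println("processing the queue...")
}
```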
Guys, what the fuck.
Today I was doing some consistency checks across the board after an update was made to one of our core systems that manages money. Yeah, real, live money.
I have a payment processor hidden from the public, with a simple API etc. So one of my checks, "gate has the same balances as the gate's internal account on the core", blinked red. Okay well, fuck, that's a really, really shitty situation to be in. I guess my gate is fucked up in some way.
Okay, debug mode on, maintenance mode on, quick look at the DB, oh shit, a client paid 4 times 15k EUR without any txn on the core system... SHIT! Postman... Fuck, Postman ofc won't start, quick google, fixing Postman, tension in me grows, because it's a really rough and tough fuckup on my side, and I got a call. That moment when you know someone already knows is for me the apogee of stress, which just skyrocketed from calm morning to mad morning.. Okay, I pick up the phone, and I hear that one client paid (using the core system app) and got a strange message. YES I KNOW, I'm working on it.. Wait, you say the core system gave them an odd message??? I will check it out. Finally fixed Postman, 3 requests and I know it's a bug on the core system.
Why, why in the motherfucking bloody world would anyone push a critically bugged update to a system that just sends API callbacks of "yes, he paid" when someone didn't pay...
Fuck, I'm stressed and pissed, but at the same time relieved it's not my personal fuckup (yeah, I solo wrote that gate, but the code was externally audited and all they had to say was that some cosmetic linting should be done)2 -
In my current org we had an AWS SES event processor written in Node.js; it was struggling every time we had more than 1000 messages in the queue. It looped over every single message, made some DB calls, then processed the next message. At one point we had to run 300 containers of this thing to clear out the queue.. It was still horribly slow.
I rewrote it in Go with channels and goroutines; now we need to run only a single container to handle up to 100k messages in the queue. Used 10 goroutines to constantly pull 10 messages each and put them in a channel, then spawned 1 goroutine per message to process them quickly. I'm so proud of this solution, we then brought this workflow to many other event processing services. 😎4 -
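The fan-out shape described above is easy to sketch. This is not the actual service, just the skeleton of the pattern: a fixed set of puller goroutines feed a shared channel, and each message gets its own short-lived handler goroutine. `fetchBatch` and `handle` are stand-ins for the SQS receive call and the per-message work:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// fetchBatch stands in for a ReceiveMessage call returning up to 10 messages.
func fetchBatch(puller int) []string {
	time.Sleep(100 * time.Millisecond) // pretend network latency
	return []string{fmt.Sprintf("msg from puller %d", puller)}
}

// handle stands in for the per-message work (DB calls, bounce handling, etc.).
func handle(msg string) {
	time.Sleep(50 * time.Millisecond)
	fmt.Println("processed:", msg)
}

func main() {
	msgs := make(chan string, 100)

	// 10 pullers constantly fetching batches and feeding the channel.
	for p := 0; p < 10; p++ {
		go func(p int) {
			for {
				for _, m := range fetchBatch(p) {
					msgs <- m
				}
			}
		}(p)
	}

	// One goroutine per message; the WaitGroup just keeps this demo bounded.
	var wg sync.WaitGroup
	for i := 0; i < 50; i++ { // demo: stop after 50 messages
		m := <-msgs
		wg.Add(1)
		go func(m string) {
			defer wg.Done()
			handle(m)
		}(m)
	}
	wg.Wait()
}
```

The buffered channel decouples pulling from dispatching, and because each handler runs in its own goroutine, one slow message doesn't hold up the rest of the batch.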
GOD DAMN IT COLLEGE YOU DID IT AGAIN. for real college can go suck Satan's 50 inch red cock for all I care.
A professor asked me to design a processor and said I'd get a bonus. I said okay cool, nothing hard.
oh but it has to be in verilog.
okay cool.
oh and it has to be on this fucking ancient useless piece of shit called xilinx that the fucking college provides to you only via a fucking 50 gigabyte virtual machine.
sigh. okay..... challenge accepted.
It fucking crashes every 2 minutes. And after 3 days of no sleep, I finally finished the ALU, control unit, 4k memory, 8 registers and the buses.......... BUT THEN THE ENTIRE VIRTUAL MACHINE CRASHED AND LOST ALL PROGRESS...... fml.
and the professor only gave me the bonus for the Alu. sigh. fuck college.11 -
(Warning: This rant includes nonsense, nightposting, unstructured thoughts, a dissenting opinion, and a purposeless, stupid joke in the beginning. Reader discretion is advised.)
honestly the whole "ARM solves every x86 problem!" thing doesn't seem to work out in my head:
- Not all ARM chips are the same, nor are they perfectly compatible with each other. This could lead to issues for consumers, for developers or both. There are toolchains that work with almost all of them... though endianness is still an issue, and you KNOW there's not gonna be an enforced standard. (These toolchains also don't do the best job on optimization.)
- ARM has a lot of interesting features. Not a lot of them have been rigorously checked for security, as they aren't as common as x86 CPUs. That's a nightmare on its own.
- ARM or Thumb? I can already see some large company is going to INSIST AND ENFORCE everything used internally to 100% be a specific mode for some bullshit reason. That's already not fun on a higher level, i.e. what software can be used for dev work, etc.
- Backwards compatibility. Most companies either over-embrace change and nothing is guaranteed to work at any given time, or become so set in their ways they're still pulling Amigas and 386 machines out of their teeth to this day. The latter seems to be a larger portion of companies from what I see when people have issues working with said company, so x86 carryover is going to be required that is both relatively flawless AND fairly fast, which isn't really doable.
- The awkward adjustment period. Dear fuck, if you thought early UEFI and GPT implementations were rough, how do you think changing the hardware model will go? We don't even have a standard for the new model yet! What will we keep? What will we replace? What ARM version will we use? All the hardware we use is so dependent on knowing exactly what other hardware will do that changing out the processor has a high likelihood of not being enough.
I'm just waiting for another clusterfuck of multiple non-standard branching sets of PCs to happen over this. I know it has a decent chance of happening, we can't follow standards very well even now, and it's been 30+ years since they were widely accepted.5 -
TL;DR; do your best all you like, strive to be the #1 if you want to, but do not expect to be appreciated for walking an extra mile of excellence. You can get burned for that.
They say verbalising it makes it less painful. So I guess I'll try to do just that. Because it still hurts, even though it happened many years ago.
I was about to finish college. As usual, the last year we have to prepare a project and demonstrate it at the end of the year. I worked. I worked hard. Many sleepless nights, many nerves burned. I was making an android app - StudentBuddy. It was supposed to alleviate students' organizational problems: finding the right building (city plans, maps, bus schedules and options/suggestions), the right auditorium (I used pictures of building evac plans with classes indexed on them; drawing the red line as the path to go to find the right room), having the schedule in-app, notifications, push-notifications (e.g. teacher posts "will be 15 minutes late" or "15:30 moved to aud. 326"), homework, etc. Looots of info, loooots of features. Definitely lots of time spent and heaps of new info learned along the way.
The architecture was simple. It was a server-side REST webapp and an Android app as a client. Plenty of entities, as the system had to cover a broad spectrum of features. Consequently, I had to spin up a large number of webmethods, implement them, write clients for them and keep them in-sync. Eventually, I decided to build an annotation processor that generates webmethods and clients automatically - I just had to write a template and define what I want generated. That worked PERFECTLY.
In the end, I spun up and implemented hundreds of webmethods. Most of them were used in the Android app (client) - to access and upsert entities, transition states, etc. Some of them I left as TBD for the future - for when the app gets the ADMIN module created. I still used those webmethods to populate the DB.
The day came when I had to demonstrate my creation. As always, there was a commission: some high-level folks from the college, some guests from businesses.
My turn to speak. Everything went great, as reversed. I present the problem, demonstrate the app, demonstrate the notifications, plans, etc. Then I describe at high level what the implementation is like and future development plans. They ask me questions - I answer them all.
I was sure I was going to get a 10 - the highest score. This was by far the most advanced project of all presented that day!
Other people do their demos. I wait to the end patiently to hear the results. Commission leaves the room. 10 minutes later someone comes in and calls my name. She walks me to the room where the judgement is made. Uh-oh, what could've possibly gone wrong...?
The leader is reading through my project's docs and I don't like the look on his face. He opens the last 7 pages where all the webmethods are listed, points them to me and asks:
LEAD: What is this??? Are all of these implemented? Are they all being used in the app?
ME: Yes, I have implemented all of them. Most of them are used in the app, others are there for future development - for when the ADMIN module is created
LEAD: But why are there so many of them? You can't possibly need them all!
ME: The scope of the application is huge. There are lots of entities, and more than half of the methods are but extended CRUD calls
LEAD: But there are so many of them! And you say you are not using them in your app
ME: Yes, I was using them manually to perform admin tasks, like creating all the entities with all the relations in order to populate the DB (FTR: it was perfectly OK to not have the app completed 100%. We were encouraged to build an MVP and have plans for future development)
LEAD: <shakes his head in disapproval>
LEAD: Okay, That will be all. you can return to the auditorium
In the end, I was not given the highest score, while some other, less advanced projects, were. I was so upset and confused I could not force myself to ask WHY.
I still carry this sore with me and it still hurts to remember. Also, I have learned a painful life lesson: do your best all you like, strive to be the #1 if you want to, but do not expect to be appreciated for walking an extra mile of excellence. You can get burned for that. -
I need a new Laptop.
- AMD A8 processor, 2.5GHz
- Windows 10 64-bit
- 8GB RAM
- 1TB HDD
Etc..
Is this enough for programming, using eclipse and so on?16 -
devRant is sooo cool. My new
Word-O-Matic physical keyboard word processor setup will increase my rant efficiency. Used velcro to fasten the phone and keyboard on a portfolio notebook! Way faster than virtual keyboard. -
Nothing better than watching sshd generate a new set of keys every time you boot your 300MHz ARM processor. Just because the entire filesystem is in RAM.2
-
Ok, a customer came to us saying he had a product that is just randomly rebooting. It sure must be a software issue.
I got the task and worked my way through ~10k lines of assembly code (8085 processor on board). Weeks go by, I tested every single god damn function they had, analyzed every vector they put in, finding NOTHING...
Meanwhile the hardware department analyzed and tested some possible culprits on the product for me. I had NO idea what the problem could be...
Then the hardware department said: oh, they forgot a resistor on the FUCKIN RESET PIN OF THE PROCESSOR!!!!
fml...5 -
Intel 8085 micro-processor, anyone?
During my graduation, one of the semesters had Intel 8085 programming in the curriculum. It's because of that dev kit that I understood what assembly-level language means.
A simple scenario of adding two numbers would result in a half-page-long sequence of commands that literally didn't excuse any mistakes.
It made me understand the semantics of what we get taught as "middle level" languages.
We had to memorize the exact pins of the thing and had to draw it from memory. And we had to learn the instruction set it had.
Later we had to learn Intel 8086 but its instruction set was way too complicated and I gave up on it.
I know it sounds geeky but I randomly remembered it today.13 -
I will not miss you, bitch. See screenshot. I received new hardware. I will use a laptop with good specs as a server. My dad bought it from his previous employer when he went into retirement. It has an ultrabook-grade 11th gen processor and he only bought it for 350,- euro. His former employer was a school; they don't give a fuck about money like a commercial company would in such a case. It was originally bought with tax money anyway.
https://llm.molodetz.nl is currently online but not for long, i hope to have smth running at end of the weekend. Probably a 7b model. I have plans with it that require some performance so I won't use the heavy ones.
Retoor1b currently is 0.5b or 1.5b. I forgot. The models with lower parameter count are a bit more naive and trainable like a kid. They're also not very biased yet. So, that will be my main new challenge. How to make a chat bot unethically human. No political correctness under this roof.
Would be nice if i could make it a bit like bratgpt. Sounds like a joke, but that model is expensive as fuck. You'll be shocked. But i would like to implement some sarcasm in it. A bit unpredictable. But normally such configuration escalates into very weird behavior.
My 'server' has a freaking 4K screen and i'm working on a decade old laptop. But seriously, the keyboard of the new one sucks. Nothing beats a x270. * tik tik tik * rakketakketak *. My previous x270 missed four keys. The three x270's i had, all had familiar experience but still different. The other two would never lose a key I guess. I configured the new 'server' that it safes battery, configured for mostly on AC.
I'm living on a limited amount of cash (and will work again when I run out). That's why I normally don't spend money on such things myself. So I'm now very happy. Fuck, this was about to be a rant about how much my AI sucks but it ended in happy stuff. Oh well...
If you're still reading, you're the best!
Edit:
Image uploading broke again. Here is the link: https://devrant.molodetz.nl/llm.png9
What was the worst conversation you've had with a person in the tech field?
For me:
Saw a person who wanted to upgrade his processor from an i5 to an i7.. just like a Windows update...5
it would help if I had time to learn even a little more C, as I'm bumbling my way through the Linux kernel and GodMode9 (an amazingly powerful 3DS manip tool for everything from the SD card to the NAND to literally raw FIRM0/FIRM1 bootloader access) to try and patch some code from GM9 into the kernel to handle the SD card *properly* so Linux 3DS doesn't constantly hang when reading/writing to the SD card, to enable Wi-Fi access (same bus location and similar bus structure as SD/NAND access, different processor), enable NAND decryption and access (yes, really, NAND is encrypted via software, which is... ...fun...) and more.
tl;dr: the 3DS hardware, C, and others' code collectively make me wanna slit my fucking wrists. Hopefully my sacrifice allows higher-level programming languages to be viable for low-level jobs in the future.4
My last post was a year ago. What brought me back here is the ability of AI to agree and apologize to anything and everything, while producing the worst hopeful code.
4 days I wasted, trying to make an android audio visualizer, but AI... sigh.
It gave me the wrong structure of FFT bytes emitted. I corrected it
It gave me the wrong logarithm calc, I corrected it
It gave me the wrong sampling rate, I corrected it.
It gave me the wrong texture order, I corrected it.
It gave me the wrong glsl sample2d, I corrected it.
It gave me the wrong textureID generation, I corrected it.
It gave me a render which was about 10 fps, I found out that instead of using native onDraw, I had a fcking delta time in my shader. I almost corrected it, I gave up
Lets go to code generators with Annotations.
Like always, starts very positive, until I start to correct it.
It gave me the wrong file locations, I corrected it.
It gave me the wrong order of find, copy, modify and write to .build, I didn't correct it.
It gave me regexes to find annotations. I'm like, so what's the use of an "ANNOTATION PROCESSOR"?
It apologizes and used a fucking regex in the processor,..... I didn't correct it. In the end, I was left with a separate module, targeting iOS, Android and JVM, with an annotation processor implemented in jvmMain, which tries to modify commonMain src by finding annotations with regexes, which won't run on app build or project sync, but only on a java -jre command pointing to that fucking .java class in that module, which takes at least 2 mins to run, and finally generates 0 files.
I needed to rant. I understand LLMs are just models of words built and stolen from the most intelligent and dumbest people out there. But I'm an idiot for getting my hopes high. I can't build anything new and unheard of. I used to do that. I once made a textView + image print util for a bluetooth printer just to say FU to libraries and heavy SDKs. Like literally rasterizing shit to bluetooth packets. I needed to let off some steam. I haven't been here in a year so I don't know what reactions I can get from this rant. I bet someone will just say yeah we're tired of 'Fuck AI' rants. But shit, it hurts. When I gave up on that visualizer, I downloaded an app, I think it's called projectM, like in reference to MilkDrop.. like the Winamp MilkDrop. I opened it, played something on Spotify, and let my eyes go blind9
I got this in the shop today: a fully-working XP POS AiO with an x86 processor. 2.2GHz Celeron dual-core, 2GB RAM, 150GB HDD. Y'all can have it for free, but no company will ship it, so you gotta come get it.6
-
So yesterday I made a rant about my tablet taking a shit and going bye bye. Well, today I'd like to introduce my new Lenovo Ideapad 110 Touch 15ACL with
An AMD A8 processor
8GB RAM
1TB HDD
Windows 10 Pro OS
Yea, I'm not going to list everything, that's too much7
Thinking of turning my 6-year-old Intel Atom processor netbook into sth useful other than just dumping it.
Any good suggestions for reusing it?9
So today I gave my superiors at work a piece of my mind about the PC I sit at (I was/am not happy about it). I am expected to use Photoshop and Illustrator on a PC with 8GB of DDR3-1333 and no graphics card, except for the one integrated into Intel's processor, which is also super outdated. Like you couldn't get more than $80 for the entire machine. So, today I ranted at them irl and my message finally seems to have sunk in. Fuck.
-
FUCK YOU NODE JS AND FUCK YOU SYNOLOGY
Decided to give an old Synology DiskStation that sits at home some new life besides just sharing files. Since Synology has SSH but not a full Linux OS, I installed DebianChroot (so far so good). At one point I needed Node.js, so I installed NVM and tried to install Node. Well, guess what, it didn't work. Tried a few more things including directly downloading Node from the official Node website. Trying different versions, the whole drill.
After about 5 hours of installing and errors, well, really useful errors like "There were 2 errors during installation". WELL HOW ABOUT YOU FUCKING TELL ME WHAT THE ERROR IS YOU FUCKING FUCK!
I found a forum with a guy having similar problems. Able to install legacy 0.10.x versions but not 4.x.x. Or 6.x.x or whatever. He found that you have to have at least an ARMv6 compatible processor, otherwise it won't run. Checked it and well, that old fuck of mine only has ARMv5. FUCK! But honestly. You detect it's an ARM architecture. You detect it's not one of the v6 or v7, you try to install the general ARM version, BUT YOU DON'T GET THE FUCKING IDEA TO MENTION TO CHECK WHAT VERSION YOU HAVE AND IF THAT IS SUPPORTED BY FUCKING NODE!
One afternoon wasted, at least I got a little more wisdom. Fuck do I hate Node now. On the bright side, I've ordered a Raspberry Pi and two cases for hard disks, I'll create my own diskstation with blackjack and hookers (I really hope you get that reference)! Fuck you Synology and Node JS (yeah yeah, it's not Synology's fault, but I'm mad anyways!)4
Got one right now, no idea if it’s the “most” unrealistic, because I’ve been doing this for a while now.
Until recently, I was rewriting a very old, very brittle legacy codebase - we’re talking garbage code from two generations of complete dumbfucks, and hands down the most awful codebase I’ve ever seen. The code itself is quite difficult to describe without seeing it for yourself, but it was written over a period of about a decade by a certifiably insane person, and then maintained and arguably made much worse by a try-hard moron whose only success was making things exponentially harder for his successor to comprehend and maintain. No documentation whatsoever either. One small example of just how fucking stupid these guys were - every function is wrapped in a try catch with an empty catch, variables are declared and redeclared ten times, but never used. Hard coded credentials, hard coded widths and sizes, weird shit like the entire application 500ing if you move a button to another part of the page, or change its width by a pixel, unsanitized inputs, you name it, if it’s a textbook fuck up, it’s in there, and then some.
Because the code is so damn old as well (MySQL 8.0, C#4, and ASP.NET 3), and utterly eschews the vaguest tenets of structured, organized programming - I decided after a month of a disproportionate effort:success ratio, to just extract the SQL queries, sanitize them, and create a new back end and front end that would jointly get things where they need to be, and most importantly, make the application secure, stable, and maintainable. I’m the only developer, but one of the senior employees wrote most of the SQL queries, so I asked for his help in extracting them, to save time. He basically refused, and then told me to make my peace with God if I missed that deadline. Very helpful.
I was making really good time on it too, nearly complete after 60 days of working on it, along with supporting and maintaining the dumpster fire that is the legacy application. Suddenly my phone rings, and I’m told that management wants me to implement a payment processing feature on the site, and because I’ve been so effective at fixing problems thus far, they want to see it inside of a week. I am surprised, because I’ve been regularly communicating my progress and immediate focus to management, so I explain that I might be able to ship the feature by end of Q1, because rather than shoehorn the processor onto the decrepit piece of shit legacy app, it would be far better to just include it in the replacement. I add that PCI compliance is another matter that we must account for, and so there’s not a great chance of shipping this in a week. They tell me that I have a month to do it…and then the Marketing person asks to see my progress and ends up bitching about everything, despite the front end being a pixel perfect reproduction. Despite my making everything mobile responsive, iframe free, secure and encrypted, fast, and void of unpredictable behaviors. I tell her that this is what I was asked to do, and that there should have been no surprises at all, especially since I’ve been sending out weekly updates via email. I guess it needed more suck? But either way, fuck me and my two months of hard work. I mean really, no ego, I made a true enterprise grade app for them.
Short version, I stopped working on the rebuild, and I’m nearly done writing the payment processor as a microservice that I’ll just embed as an iframe, since the legacy build is full of those anyway, and I’m being asked to make bricks without straw. I’m probably glossing over a lot of finer points here too, just because it’s been such an epic of disappointment. The deadline is coming up, and I’m definitely going to make it, now that I have accordingly reduced the scope of work, but this whole thing has just totally pissed me off, and left a bad taste about the organization.9 -
TL;DR: Microsoft updates break drivers, make unbootable. Hours wasted. Such rage.
Lol. I come home, try booting my windows desktop. Need desperately to play some videogames. Power is on. Monitor lights up. Bios splash. Windows startup spinner.
Suddenly, windows startup spinner gone, monitor shuts off. Wait 5 minutes, no change. Force power off and reboot, same behavior.
Google says it's probably a bad video driver. I don't remember installing any in the last month, but heck I don't use this computer for shit outside of games, so may as well do a full OS reinstall and hope the problem drivers are gone.
Reboot and force power off halfway through boot to let windows know something's wrong next boot. Literally no other way to get to alternate boot methods.
Run the reset. First time, percent-counter starts. I leave the room at 30% to go get a sandwich. Come back and it says it's "undoing changes". Something went wrong and I have no way of knowing what.
Oh well, I'll just try again and see what the problem was. NOPE! Completes windows reinstall without a hitch on the second attempt.
Okay, now let's get my stuff back on here. First things first, Microsoft updates for my processor, graphics card, "security". Halfway through the updates, monitor shuts off and I'm back to square one. IT WAS THE MICROSOFT DRIVER, NOT THE ONE FROM NVIDIA GEFORCE EXPERIENCE!!!!
Fucking Microsoft. To all ye who rail against Linux as a gaming platform because of its unstable drivers, observe here the stupidity of Microsoft and weep.3 -
WHY NEXTCLOUD, WHY DOES IT HAVE TO BE SO FUCKING COMPLICATED TO UPGRADE YOU TO THE LATEST VERSION??????
NOT ONLY DID THE DATABASE MIGRATIONS NOT RUN, YOU DIDN'T HAVE THE UPDATED VERSION OF THE FUCKING AUTHENTICATION PROCESSOR IN THE OFFICIAL SERVER DOWNLOAD, SO THE WHOLE THING WAS BROKEN AFTERWARDS.
I'VE WASTED 4 HOURS ON THIS FUCKING UPGRADE 😤😤😤😤😤😤😤4 -
If you have a 13900k and you have random BSODs and application crashes, use XTU and turn down your P-cores in Performance Per Core Tuning to 54x. There is some kind of bug in the turbo boost and going past the 5.4GHz mark just doesn't work.
I've basically built this computer twice now. I replaced the motherboard because I destroyed it, replaced the RAM because I thought I had the wrong type, and now the processor which was the actual root of all my problems.4 -
Completed a Python project, started out of interest but completed as an academic project.
A smart surveillance system for a museum.
Requirements
To run this you need a CUDA-enabled GPU on your computer. (Highly recommended)
It will also run on computers without a GPU, i.e. it will run on your processor, giving you very poor FPS (around 0.6 to 1 FPS); you can use AWS too.
About the project
One needs to collect lots of images of the artifacts or objects for training the model.
Once the training is done you can simply use the model by editing the 'options' in webcam files and labels of your object.
Features
It continuously tracks the artifact.
Alarm triggers when artifact goes missing from the feed.
It marks the location where it was last seen.
Captures the face from the feed of suspects.
Alarm triggering when artifact is disturbed from original position.
Multiple feed tracking (if the artifact goes missing from feed 1 due to occlusion, a false alarm won't be triggered since it looks for the artifact in the other feeds)
Project link https://github.com/globefire/...
Demo link
https://youtu.be/I3j_2NcZQds2 -
A friend of mine said that an Intel Pentium and an i3 are the same, that all laptops with Intel's logo have the same processor, you just have to check the info on yours...2
-
Okay, so one of my friends got an offer for a more powerful server with 128GB RAM and an OK processor, because the current server load is high. When they got the offer for the new one, I saw that the licenses part included Windows Server 2016. Which to me seems like the worst thing you can do when you're just using PHP and MySQL and nothing Active Directory or really Windows-specific. Can some of you please write, in short, why you'd use Linux for servers instead of shitdows? It would clearly cost much less too. Because I guess if others say it, they, the client, will agree...16
-
So I want to build myself a custom button box for Star Citizen and War Thunder; my HOTAS really doesn't have enough buttons, or fittingly placed ones, for either of those games. What would be the best approach for this?
My first thought was to take an Arduino or some AVR chip on a USB connection and write a custom joystick driver, but that would be a major pain on the buttocks.
Also, would I buy an Arduino or go full custom and buy some chip from TI and DIY the board completely?
On the other hand, if I'm gonna tinker with stuff on my own time I probably should pick up an ARM processor so I get familiar with the architecture, but that's probably overkill.
But 8-bit AVR is so constrained that maybe, if I want to expand and create something like an MFD, the poor 8266 would probably just go up in flames.
Does anybody have a better idea or know of some ready-to-rock board for this kinda stuff? Best case scenario with a joystick driver or something?8
Anybody have recommendations for a laptop? I want a laptop to finish high school with, but more importantly something that can be my primary computer in college for school, coding, and gaming (doesn't need to run really intense games like CoD)
I want:
•15 inch screen
•i5 or i7 processor
•500 gb storage or more
•6 gb RAM or more
•decent front webcam
•good battery life
•$600 or less
•NOT a MacBook
Thanks :)13 -
Dell is such an awful machine to use with Ubuntu. Even though it was an officially Ubuntu-installed machine, it has so many issues, and my work is suffering because of this machine, which cost me 70k PKR. It has 8GB RAM, a 500GB HDD and a 4th gen Core i3 processor.
I'm suffering from Wi-Fi getting disconnected from time to time, and I couldn't find help on the Ubuntu forums nor on the official Dell site.
I guess both suck pretty bad.
Will at least never buy a Dell machine again, nor one with the stupid Ubuntu OS just3
Trying to install Linux on an HP Stream 7 has been way more difficult than it should have been, even when you take into account that it's a 32-bit processor with a 32-bit EFI!
First off, the only things I've been able to get it to boot right off the bat are Android x86 and BlissOS... kind of. You would think that Android x86 would be perfect for a tablet, right? Nope, performance sucked sooooo bad.
After reading some forums, I was finally able to get Ubuntu to load up... with the limiting factor being no on-screen keyboard.
So... at the moment I guess I'm stuck with a useless Windows tablet, and probably will be for a long time (you know, since 32-bit architecture is being dropped)6 -
I gave in...
Chimera N850HK1 15.6'' Full HD IPS Display 1920x1080 Laptop
Processor
Intel® Core™ i7-7700HQ Mobile Processor (4x 2.8GHz/6MB L3 Cache) [N850HK1]
Memory
8GB DDR4-2400
Video Card
NVIDIA GeForce GTX 1050 TI GDDR5 4GB - [N850HK1]
Primary Hard Drive
1 TB 7200rpm Super Slim Laptop Hard Drive - Single Drive2 -
My brain = processor
Your mouth = raw data
I only process the logic that comes out of your mouth, typecast it to my system's logic and try to fit you into one of my objects using a visitor pattern. If I need to create a new dynamic object, my system throws a "you are special" message.
So Vivo made a bezel-less phone this time.
But guess what, it's again gonna contain a fucking kazillion-megapixel selfie camera with a Mars Light flash to make you look Unique, paired with a shit MediaTek processor.
Also,
they've got stones from Krypton, and their next phone is gonna have a Krypton light flash just so the fuckin' Superman can't use it, just because they partnered with Marvel to get the Infinity Stones so that they can use them in their later phones as a light for the selfie flash.
Guess what? Thanos preordered it...
Well played, Vivo3
Turns out the reason the selection of RA cores on the PS3 sucks so much dick is that it has a half-complete OpenGL implementation.
Fuck it, we're doing it live. I've got CFW, I'll dynarec if I damn well please. If it won't compile I'll fucking make it compile. I don't give a fuck about "sanity restraints" and "needing OpenGL 2" and "LV1 required"; I can fucking replace all of LV1 and LV2 if I fucking want. FUCK YOU
The Cell processor is pretty beefy, I could prolly do it all-CPU if I wanted.8 -
Just had a meeting about performance and monitoring. The main topic of the meeting was to be aware of disk space usage. If there are issues with memory leaks or processor hogging don't worry those are fine, just give it more.1
-
Guys, I'm planning to buy a new laptop for Dev purpose. My budget is under 45K INR. Need a laptop with 8GB RAM and I'm confused about the processor (i5 or i3 or AMD). Please suggest the processor I should choose and some laptop models meeting the above criteria.5
-
Coding in the esoteric language Brainfuck is pretty funny. Or a coding challenge not for the smoothest-running program, but for the longest possible compilation time on an IBM Z processor.1
-
My machine is running on 6GB of RAM with an Intel Core i5 processor. How the fuck are my .mkv files still stuttering in VLC?!? 😣😣😣10
-
CIA – Computer Industry Acronyms
CD-ROM: Consumer Device, Rendered Obsolete in Months
PCMCIA: People Can’t Memorize Computer Industry Acronyms
ISDN: It Still Does Nothing
SCSI: System Can’t See It
MIPS: Meaningless Indication of Processor Speed
DOS: Defunct Operating System
WINDOWS: Will Install Needless Data On Whole System
OS/2: Obsolete Soon, Too
PnP: Plug and Pray
APPLE: Arrogance Produces Profit-Losing Entity
IBM: I Blame Microsoft
MICROSOFT: Most Intelligent Customers Realize Our Software Only Fools Teenagers
COBOL: Completely Obsolete Business Oriented Language
LISP: Lots of Insipid and Stupid Parentheses
MACINTOSH: Most Applications Crash; If Not, The Operating System Hangs
AAAAA: American Association Against Acronym Abuse.
WYSIWYMGIYRRLAAGW: What You See Is What You Might Get If You’re Really Really Lucky And All Goes Well.
Credit to: http://devtopics.com/best-programmi... -
When I was maybe 3 years old my dad built a PC with a server case, it was huge! The processor was probably something like 386/486 - not sure. I used to play DOS games on it all day long. And the best part is that we still have the PC and surprisingly it still runs!
(Meanwhile I am cloning my secondary 1TB HDD to a 6TB one) -
I'm writing a website for a café and I'd like to use a new tool for generating content and managing it. Only real requirement is a SCSS pre-processor and maybe built-in Auth.
Any suggestions?3 -
It's 5:32 am. I've been working on a new social media website, opened up devRant to check what's up, and found out people really have amazing PC setups ♥️ I couldn't post mine because my laptop has a broken keyboard and I use a USB keyboard; also my laptop overheats, so I stuck skateboard wheels to two of the back supports. It's an i5 processor with 3GB of RAM, but I love it. It's what you do with your resources that really matters, right? With that broken keyboard I created a website for my college department which 720 students and teachers now use, and we have around 8k entries in 3 days. I'm running the website on a local server at my college, so no money spent on hosting. Taking small steps towards my goals, and hopefully one day I'll publish my setup1
-
When I was 7, my dad bought me a PC (in 2007). It had some sort of Pentium processor and 2GB of RAM. I had that PC until I turned 18. We changed the power supply, some ports and the mouse/keyboard, but that's it. I learned all the programming basics, design and even animation on that PC. It would take days to render a simple 2D animation. When I finally got the chance to go to a university and move overseas, I bought my own PC. As a full-fee-paying student, I still couldn't afford the latest and greatest tech. I somehow bought a Core i7 with 8GB of RAM, no bells and whistles. And now I'm almost 21. So when my friends recommend me killer games, I'm sorry fam, call me maybe 10 years later.2
-
I fucking hate Ryzen issues with Linux.
Random freezes.
Added the processor max C-state in GRUB and disabled the C6 state in the BIOS.
Motherfucking OS still freezes12
As opposed to my horrific experiences with PayPal, Swish, a Swedish (really smooth) payment processor has some really nice documentation. An example:
"The callback, in the happy case, will return an intermediate response with the status DEBITED."
And other nice things such as clear numbered lists describing user flows, with images for extra clarification. Also, they provide full lists of error responses and in many cases suggested way to proceed with these error cases.
And as the cherry on top, this is developed as a cooperation between a few Swedish banks. The banks, who are the thickest type of companies when it comes to IT, do it better than PayPal.6
Oh yeah, that one's good.
So my Lenovo Y520 (bought about 18 months ago) has been throttling its i5 to 800MHz for no reason for at least a year now.
The thing was so fucking slow, you can't even imagine.
A few months back I found out why: the fucking processor raises the PROCHOT flag constantly, even though it never exceeds 60°C.
So now, every time I boot my computer, I have to run a fucking script to reset this fucking flag. Seriously Lenovo, what the fuck?
I just bought a new smartphone, 'cause I broke the one I had before, and I find it really good for only 140€. It has 4GB of RAM, a 1.5GHz quad-core processor, and IP68 shock/dust/water protection. The only problem I have is that, since the brand is not really popular (the model is a Phonemax Rocky 1), there are no recoveries/ROMs or easy ways to root it. Maybe if I have time I'll try to port TWRP and LineageOS, but I'll have to do it myself... :(1
-
"I'll buy the new iPhone because of its RAM"
"What amazes me of my new iPhone is its wonderful processor"
"I bought my iPhone because it let me free to do whatever I want"
- nobody, never
Seriously, have you ever seen someone saying that? Why the hell we nerd post things about how much is technically poorer an iPhone if compared with other phones when actually nobody cares about this? Come on, I'd never buy an iPhone as well, but I don't think this is the best way to change the other people' mind, I'm pretty sure that will increase the popularity instead.6 -
~rant
Hey all! I'm gonna be buying a new laptop for programming.
I need something with like 16 gigs of RAM, a decent processor and an SSD.
I can't buy a ThinkPad because, well... it should have been ~$750, but in my country it costs $1200. And that is for the 8GB RAM config of the E470... the E570 isn't even available.
Hence, given the lack of laptops with high configs but without a dedicated GPU, I can basically find 2 options:
Dell XPS
MacBook
So I wanna ask: what would you guys prefer? I code in C/C++ pretty much exclusively, and I definitely like a butter-smooth OS.
If it ain't a MacBook, I'll be using Arch Linux.
Finally, I live in India.
So... which one do I pick? And if you have a recommendation, I'm open to that too. It should just have good specs, BUT NO DEDICATED GPU.
Thanks 😄8 -
Here is a weird fact I have been thinking about this evening:
The Helio X20 was the only mainstream ARM processor that had 10 CPU cores. It was first introduced in 2015; however, no more ARM processors with such a high core count have been used since then.
Nowadays smartphone processors have `8` cores max 🤔🧐
I guess 8 cores is the reasonable limit for smartphones. It must have something to do with the cost-to-performance factor3
!rant
https://github.com/rohitshetty/...
I am a young dev trying my hand at different stuff.
So I would appreciate any criticism or comments that would allow me to learn more :) or good practices I can follow.
Here is one project where I tried to create a structured, framework-ish way to write MQTT processors.
MQTT processors are standalone apps that process MQTT messages that have to be acted upon (like adding sensor data sent from a sensor node to a DB, reading from the DB, or turning some GPIO on or off if the app is on an embedded device like a Raspberry Pi), etc.
This project creates a structure where you can just focus on writing subscribed-topic listeners in a clean, neat way. (Hopefully)6
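For readers unfamiliar with the idea, here is a minimal sketch of what such a subscribed-topic listener boils down to in Python, assuming the paho-mqtt 1.x callback API and a broker on localhost; the topic name and handler are illustrative only, not taken from the linked project.

# Minimal MQTT "processor": subscribe to a topic and act on each message.
# Assumes the paho-mqtt 1.x callback API and a broker on localhost:1883.
import paho.mqtt.client as mqtt

TOPIC = "sensors/+/temperature"  # illustrative topic, not from the project

def on_connect(client, userdata, flags, rc):
    # Subscribe here so the subscription is restored after reconnects.
    client.subscribe(TOPIC)

def on_message(client, userdata, msg):
    # The "processor" part: e.g. write to a DB or toggle a GPIO.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("localhost", 1883, keepalive=60)
client.loop_forever()  # blocks and dispatches the callbacks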
Someone has a cloud VM running automated attempts to sign up at our website, which is causing the payment processor to block us because of all the suspicious credit card creation attempts, so we get no new signups... I suppose implementing recaptcha is a potential solution/mitigation for this? Do you guys have any other suggestions?12
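If reCAPTCHA is the route taken, the part that actually blocks the bots is verifying the token server-side before the signup ever reaches the payment processor. A rough Python sketch, assuming the requests library and the documented siteverify endpoint; the secret key and score threshold are placeholders.

# Server-side reCAPTCHA check before a signup reaches the payment processor.
# RECAPTCHA_SECRET is a placeholder for your real secret key.
import requests

RECAPTCHA_SECRET = "your-secret-key"
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def signup_allowed(token, remote_ip=None):
    resp = requests.post(
        VERIFY_URL,
        data={"secret": RECAPTCHA_SECRET, "response": token, "remoteip": remote_ip},
        timeout=5,
    )
    result = resp.json()
    # With v3, also check the score; 0.5 is an arbitrary starting threshold.
    return result.get("success", False) and result.get("score", 1.0) >= 0.5

The server-side check is the important half; the widget on its own is trivial for a scripted client to skip.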
-
We had an ADAM/Colecovision unit before this, but I don't really count it, as it was more of a console for us than a computer.
In 1986 dad brought home a Tandy 1000 SX. It had an Intel 8088 processor, 64k of memory, and no hard drive. With dual 5.25" floppy drives, our write-protected DOS 3.1 disk stayed in drive A almost all the time. Games and other software were run from drive B, or from the external cassette drive. For really big games, like Conquest of Camelot and Space Quest 3, we were frequently prompted to swap disks in B: before the game could continue.
Space Quest, King's Quest, Lords of Conquest, Conquest of Camelot, Chuck Yeager's Advanced Flight Trainer, several editions of Carmen Sandiego, and at least a dozen other games dominated our gaming use. We wrote papers with WordStar, and my parents maintained their budget with Lotus 1-2-3.
A year or two later, Dad installed a 10 MB hard drive, and we started booting DOS off that instead. Heady days.1 -
Some user profiles I thought were worth stealing for a post:
PonySlaystation
"Full Stack Software Engineer, Electrical Engineering Student driven by OCD & Club Mate."
'club mate' read: probably white powder and Ritalin. I heard he once dismembered a horse and put the bloody head in a rival's bed.
uyouthe
"Russian assassin leader, Apple fanboy. Tabs ftw"
Comrade, apple is bushwazee capitalist filth. Only true comrades use windows, because the upgrade is free.
Root
"Magical processor fairy; part-time misanthropic bane of idiots. 🧚♀️🏹 Ergo sum miseriae"
Do you sprinkle magical processor fairy dust on each new generation of chips to increase their clock rate?
Got into a somewhat heated discussion earlier and wanted to get some more input...
A friend of mine has a community site for a game and is running ads to pay for the hosting costs, etc... He has recently changed ad providers, though, and now they've become more profitable but also a lot more obtrusive...
I suggested perhaps looking into something like Coinhive, mining Monero with your users' browsers... He was really averse to it, but I think it can be a viable alternative to ads, as long as you allow your users to opt out and don't go all out with their processors, but throttle it to, say, 5% or so...
Anyhow, he wouldn't have it, and I was wondering if I was alone in thinking I'd rather have some coins mined using my processor than see ads, especially if it's not at full speed, and with consent (and not on mobile)5
The year is 1999/2000 and my parents bought my brother a PC. Pentium processor, 128MB of RAM and a Sony Trinitron monitor. Sweet machine. Used to play Delta force, Heroes on it. My brother played Civ 2/3, Heroes 3 etc. Lovely times, no worries, no stress, the only pain was the champagne.
-
8GB of RAM and a quad-core processor are not enough for summarizing the Game of Thrones book series with word2vec.
Learned it the hard way :D1 -
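For anyone hitting the same wall: the memory problem is usually loading the whole corpus into RAM before training rather than word2vec itself. A hedged sketch, assuming gensim 4.x and one tokenised sentence per line in a plain text file, that streams the books from disk instead; the file name is a placeholder.

# Stream sentences from disk instead of holding all the books in RAM.
# Assumes gensim 4.x (vector_size instead of the older size parameter).
from gensim.models import Word2Vec

class Corpus:
    # Restartable iterable: gensim makes several passes over the data.
    def __init__(self, path):
        self.path = path

    def __iter__(self):
        with open(self.path, encoding="utf-8") as f:
            for line in f:
                yield line.lower().split()  # naive whitespace tokenisation

model = Word2Vec(
    sentences=Corpus("got_books.txt"),  # placeholder path
    vector_size=100,
    window=5,
    min_count=5,
    workers=4,  # matches a quad-core processor
)
model.save("got_word2vec.model")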
What the holy fuck! ReSharper is fucking dog shit! I've never used it before and just had to install it for a new job. Visual Studio was running great on my machine with 32GB of RAM and an i7 processor. Installed ReSharper and it just doesn't work. How the fuck does anyone get any work done when it takes literally seconds to register a click! I get that its features are impressive, but it means fuck all if it stops me from working3
-
I don’t understand why people hate Windows, especially Windows 11??
Running Windows 11 on a Dell Latitude 3520 with an 11th Gen Intel Core i5 processor, I was able to go from the computer completely shut down to a new 3D Unity project fully loaded and ready to go in only 1 minute and 25 seconds!!! That’s from completely shut down!
I haven’t overclocked the PC or anything and have used half of the 465GB storage!20 -
One of my supervisors once said: "A computer without mutable state is just a glorified electrical heater."
Meaning that at some level you'll need some mutability.
A processor/memory unit without mutability is not worth very much, except if you want to build a new one for every clock tick...3 -
My work product: Or why I learned to get twitchy around Java...
I maintain a Java-based test system that tests a raster image processor. The client is a Java Swing project that contains CORBA bindings to the internal API of the raster image processor. It also has custom-written UI elements and duplicated functionality that later became available in newer versions of Java; but because some of the third-party tools we use don't work with later versions of Java for some reason, it's not possible to upgrade Java to gain things as simple as recursive directory deletion. Yes, the version of Java we have to use does not support something as simple as that, and custom code had to be written to support it.
Because of the requirement to build the API bindings along with the client the whole application must be built with the raster image processor build chain, which is a heavily customised jam build system. So an ant task calls out to execute a jam task and jam does about 90% of the heavy lifting.
In addition to the Java code there's code for interpreting PostScript files, as these can be used to alter the behaviour of the raster image processor during testing.
As if that weren't enough, there's a beanshell interface to allow users to script the test system, but none of the users know Java well enough to feel confident writing interpreted Java scripts (and that's too close to JavaScript for my comfort). I once tried swapping this out for the Rhino JavaScript interpreter and got all the verbal support in the world but no developer time to design an API that'd work for all the departments.
The server isn't much better though. It's a Tomcat-based application that was written by someone who had never built a Tomcat application before, or any web application for that matter. It uses raw SQL strings instead of an ORM, it doesn't use MVC in any way, and an insane amount of functionality is dumped into the JSP files.
It too interacts with a raster image processor to create difference masks of the output, running PostScript as needed. It spawns off multiple threads and can spend days processing hundreds of gigabytes of image output (depending on the size of the tests).
We're stuck on Tomcat 7 because we can't upgrade beyond Java 6, which brings a whole manner of security issues, but that eager little Java updater will break the tool chain if it gets its way.
Between these two components we have the Java RMI server (sometimes) working to help generate image data on the client side before all images are pulled across a UNC network path onto the server that processes test jobs (in PDF format), by reading into the xref table of said PDF, finding the embedded image data (for our server consumed test files are just flate encoded TIFF files wrapped around just enough PDF to make them valid) and uses a tool to create a difference mask of two images.
This tool is very error prone, it can't difference images of different sizes, colour spaces, orientations or pixel depths, but it's the best we have.
The tool is installed in both the client and server if the client can generate images it'll query from the server which ones it needs to and if it can't the server will use the tool itself.
Our shells have custom profiles for linking to a whole manner of third-party tools and libraries, including a link to Visual Studio 2005 (more indirectly related build dependencies). The whole profile has to ensure that absolutely no operating system pollution gets into the shell; most of our apps are installed in our home directories, and we have to ensure our paths are correct for every single application we add.
And... Fucking and!
Most of the tools are stored as source bundles in a version control system... Not got or mercurial, not perforce or svn, not even CVS... They use a custom built version control system that is built on top of RCS, it keeps a central database of locked files (using soft and hard locks along with write protecting the files in the file system) to ensure users can't get merge conflicts by preventing other users from writing to the files at all.
Branching is heavy weight and can take the best part of a day to create a new branch and populate the history.
Gathering the tools alone to build the Dev environment to build my project takes the best part of a week.
What should be a joy come hardware refresh year becomes a curse ("Well fuck, now I loose a week spending it setting up the Dev environment on ANOTHER machine").
Needless to say, I enjoy NOT working with Java. A lot of this isn't Java's fault, but there are a lot of things that Java (specifically the Java 6 version we're stuck on) does not make easy.
This is why I prefer to build my web apps in Python or Node; hell, I'd even take Lua... Just... compiling web pages into executable Java classes, why? I mean, I understand the implementation of how this happens, but why did my predecessor have to choose this? Why?2
Have you ever tried to get something working in node.js... and then you check the version, and uninstall and reinstall... hours and hours of dicking around with this god damn Raspberry Pi Zero, only to find that Node does jack shit to support the "older" ARMv6 processor on the newest Raspberry Pi Zero W units. And omg, the amateurs out there with their copy-paste, half-assed "help" clogging up the real info. God damn hobbyware.6
-
*The one where he breaks ssh*
TL;DR: Minikube's dick is too big, and my ass wasn't ready.
So about 2 weeks ago I wanted to try and set up a minikube cluster using SOP, and that actually went okay, aside from having to move over to a completely different server after discovering that my processor doesn't support virtualization.
So I set it up on my other server, and everything immediately starts going to shit; I can no longer run commands without processor latency. Also, top shows 200% CPU usage. Maybe I should stop... NAHHH... so I continue on, and the biggest fuck-up was starting up the nginx pods. I have 6 of them, and the moment I try to stand up my custom container, which was the WHOLE POINT of this whole exercise, I lose ssh access and can't get back in. I go over to the server and kill the minikube and VirtualBox processes, and everything's back to normal.6
Does anybody have a good recommendation for a laptop for mostly full-stack web development?
I think I should look for the following features:
- minimum 16GB RAM
- Although it's 2021, just in case I'll add: USB-C to connect to a dock with two screens, and an SSD
- I'll run several docker containers at once
- from time to time I do some non-exhaustive work in C++
- good screen dpi
- I use linux
- portable. No need for the lightest on the market, but easy to carry in a bag. Good battery.
- not too expensive
I can save on:
- I don't need the latest processor, just a good one
- I'm not a gamer. I don't need the latest GPU. However, some GPU is appreciated. I don't need colorful LEDs either.
Do you have any recommendations on laptops and/or features to search for/avoid?8 -
Sometimes I hate the limitations that being a mobile dev puts on your code: processor, memory, battery, signal and WiFi limitations are all things that have to be worked around. However, I couldn't imagine doing anything else, and it's taught me how to write concise and efficient code.
-
Folks
I need your input on the following
how important do you think a high core count in a CPU is for your daily workflow?
I'm planning on buying a new Ryzen 5000 processor. While I am going to game the hell out of it, I'm also planning to run WSL2, a ton of Chrome tabs, maybe multiple IDEs for developing random stuff, maybe some virtual machines for experimentation, some Docker containers for self-hosted software, and lastly to open demanding games while having everything else open.
Will a 6-core 5600X be enough? Or do you think investing in a 5900X will be worth it down the line (let's say for the next 7 years)?
Assume that the GPU will handle the games I'm going to play and that the RAM is going to be 32GB for now11
I know I am saying this for the nth time, but...
FUCK ANDROID STUDIO. It's a fucking pile of crap, I hate it. After n years you'd think it would run smoother after this update and that update, but it has to pile its shit onto the processor and drag it along painfully, in a primitive manner, like a dying old tortoise... not even making sense. Fuck this.4
Just clearing through some of my old stuff and found my first word processor computer, might have to crank this old baby up...
-
I currently have a 3+ year old laptop.
Dell Inspiron 15 3521:
OS: Windows 10 Pro
RAM: 8GB (4+4)
Processor: Intel Core i5 3rd Gen
Video/Graphics Card: AMD Radeon 8730M 2GB (and Intel HD 4000)
Hard Disk: 1TB
It's slowly becoming sluggish and has clearly outdated hardware. I want to pursue a Master's degree in CS (Machine Learning oriented).
Should I consider upgrading? Build a PC instead? Suggestions?35 -
I remember playing games like Wolfenstein 3D, Supaplex, Sokoban, etc. on our family 486, which as I remember had around a 100MHz processor (120MHz in TURBO mode).
Yeah, and I did create a few levels in Wolfenstein; there was a simple editor.
From a programming point of view, I coded my first website using only HTML and inline CSS in the early 2000s. However, the internet was a thing for rich people back then (in my country), so my brother downloaded a whole website with docs and the basics of HTML/CSS/JS for me at college. My first website was coded on a 300MHz Pentium 2 (or 3?) with no internet connection; it took me about two months to complete and was a total mess. But it was graphically satisfying, with nice GIFs which took tens of seconds to download. The main container was 600px wide and looked pretty good at my 800x600 resolution.
I still remember messing with the BOM signature, because Notepad could not save a file without a BOM, leaving all the UTF-8 chars a mess after saving.
Good old times. -
What type of processor will run Visual Studio Community and perform instant, no-delay compiles? Apparently a 3.9GHz quad core with 16GB of DDR4-3600 RAM isn't enough. What will make things snappy?16
-
Just heard about an LG fridge with a built-in Windows 10 tablet:
"The fridge has an Intel processor and 2GB RAM. It will not replace your Desktop-PC!"
//you don't say2 -
My 7-year-old computer broke down :(
Finally I have a good reason for buying a new computer. It will have twice as much RAM, SSD drive will be more than 4 times the size of the old one, processor will be 5 generations newer and the graphic card will have 3 times more video memory.17 -
The term "CPU" is stupid nowadays, what even is "central", when there are entire server farms primarily employing GPUs.
I propose "GPP" - General Purpose Processor - as a much more descriptive name instead.3 -
How do you implement TDD in reality?
Say you have a system that is TDD-ready. Not too sure what that means exactly, but you can go write and run any unit tests.
And for example, you need to generate a report that uses 2 database tables so:
1. Read/Query
2. Processor logic
3. Output to file
So 1 and 3 are fairly straightforward, they don't change much, just mock the inputs.
But what about #2? There are going to be a lot of functions doing calculations and grouping/merging the data, and from my experience that code gets refactored a lot. Changing requirements, optimization (the first round is somewhat just "make it work"), so entire functions and classes may be deleted. Even the input data may change. So with TDD, wouldn't you end up writing a lot of throwaway code?
A lot of times I don't know exactly what I want or need, other than that I need a class that can do something like this... but then I might end up throwing the whole thing out and writing a new one once I get a clearer idea of what I or the user wants or needs.
Last week I was building a new REST API; the parameters and usage changed like 3 times. And even now the code is in feasibility/POC testing just to figure out what needs to be used. Do I need more or fewer parameters? What should they be? I've moved and rewritten a lot of code because "oh, this way won't work, need to try this way instead".
All I start with is my boss telling me I need an API that lets users ... (very general requirements).10
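One way to keep the throwaway-test cost down for step 2 is to test only behaviour you expect to survive refactoring (group totals, row counts), feeding the functions plain in-memory rows so the tests don't care how the data was queried or how the file gets written. A small illustrative sketch with pytest; summarize_by_customer is a hypothetical stand-in for the kind of grouping/merging function described above, not anyone's real code.

# Hypothetical report "processor" step, tested against plain in-memory rows.
from collections import defaultdict

def summarize_by_customer(orders):
    # Group order rows (dicts) and sum their amounts per customer.
    totals = defaultdict(float)
    for row in orders:
        totals[row["customer"]] += row["amount"]
    return dict(totals)

def test_summarize_by_customer():
    orders = [
        {"customer": "acme", "amount": 10.0},
        {"customer": "acme", "amount": 5.0},
        {"customer": "globex", "amount": 7.5},
    ]
    assert summarize_by_customer(orders) == {"acme": 15.0, "globex": 7.5}

The internals of such a function can then be rewritten or merged freely; only tests pinned to behaviour that genuinely changes become throwaway.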
It’s all a blur but in 5th grade I was using a TRS-80 with a cassette player for storage at the library where my mom worked. Also an Apple IIe at school in the computer lab. My first personal computer was an IBM XT clone with an 8086 processor and dot matrix printer. I bought it after having fun with my cousin’s Commodore 64 and wanting one, but his uncle sold me on the IBM platform as something that I could upgrade over time. I was 13 when I first learned Assembler and BASIC. Big Blue Disk was my favorite subscription software with all the games and other shareware stuff that came every month in the mail.1
-
A human is just a computer with meat. There are several subsystems:
0️⃣ low-level hormonal,
1️⃣ mid-level primal instinct and
🅰️ high-level natural language processor5 -
Anything made by Asus is fucking garbage.
My first Asus laptop was the Eee PC 701. It had only a 2-gigabyte "SSD", which was just a flash drive hooked up over USB 2.0 inside. Doing this instead of using a proper SATA SSD is like using a bunch of rats glued to a frisbee instead of a Roomba vacuum. It displayed "Intel 800MHz processor" in settings, but this was only the processor's name string; in fact it ran at only 640MHz, single core. I still managed to install a totally stripped version of Windows XP on it and play some old games, but overall it was horrible. It also heated up like hell.
My second Asus laptop was the infamous Z99H. It had Windows Vista and was slow as hell right out of the box. Also, BOTH hinges broke in just like six months. It was horrible as well. The screen itself broke within a year.
I also used an Asus RT-series router, and it required a restart every day just to deliver some WiFi. Don't you dare tell me to "uPdATe fIrMwArE"; I pay money for the product and I expect it to work right out of the box.
Asus smartphones were also garbage.
So why have an Asus laptop if you can have a real laptop like a MacBook or a ThinkPad? Why have an Asus phone if you can have a real smartphone like an iPhone or a Pixel? Why have an Asus router if you can have Ubiquiti?
Asus drivers suck, and all of Asus's software is just bloatware.10
Over the last 3 years, I have accumulated interest and experience in Android dev. Not sure about the future, but that's probably where I will be.
But this fact is moot to our 50-year-old grumpy professors teaching a 1000-year-old rusted computer syllabus, who rejected my idea of a video streaming app as a major project simply because I pitched it as a social media app, and "everyone is making a social media app, it's such an old topic". Yeah right sir, it's younger than your daughter that fucks in the lobby.
Now we are doing a project on a file conversion website, a project suggested by my team member and good friend. It's such a shitty topic: there are no resources available, even the research papers are bad, every search points to a shitty site, and I don't know shit about web dev.
Technically I am the team leader, but my teammate won't let me build the project as a native Android app, because "Brooo, I am going to make a React app that would be completely offline, completely client-side, fully secure and shitt small" and sometimes "Bro, it's my idea".
Well, 1. the whole point of client-side is stupid, because the 18 MB JS file isn't going to get downloaded into the client's cache first (or whatever the process is, idk). The top Stack Overflow answers I saw told me to buy an EC2 instance and run LibreOffice commands on it for every request, and that's SERVER SIDE. Even if we could, I am sure it's going to be bigger than what I would have made in Kotlin.
2. What am I supposed to do? Watch you code while I make all the PPTs and the research paper? You are going to use undocumented libs that "just work", and I am supposed to curate the theory behind this, looking at all the research in the world? Well, I guess okay, that's a light job since THERE ISN'T ANY.
And we are targeting all types of conversions, nice. From what I know: HandBrake (video conversion s/w) = 16 MB, Photoshop (image conversion s/w) = 1 GB, and MS Word (doc to PDF/other formats) = 500 MB.
Plus all those proprietary and undocumented formats, ugh. Thank you, ugly-ass companies.
The internet is great, but web dev has become a whole mess. "I am going to build software that runs on your system using only your device's processor" describes a desktop/mobile app, not a website.
What are people's unfiltered thoughts on Apple's new ARM processor? Especially if you are a Mac user?
(I'm an OS hopper, though my main machine is running Ubuntu right now, and my work machines are usually Mac, or Windows with WSL)20
The human brain (also animal brains, even ants) are incredibly complex. Each neuron is now supposedly its own processor. So a human brain is a complex network of billions of processors, not just threshold variables. This means to simulate an organic brain sufficiently it will take a huge computer system with billions of parallel processors. Now, I don't know if the sophistication of a computer processor is represented in each cell. So this may not be equivalent to billions of pentium cores for instance. However, it still presents a huge challenge for AI, as it exists now, to replicate. My thoughts are that AI that is silicon based will take a different approach that leverages how computers work. My guess is that current neural net models are not a good match for this unknown AI. Will it inherently exhibit pattern matching like an organic brain? Or will it be a different kind of consciousness altogether? Will we even realize it is self aware? Will my roomba plan to kill my pet for my attention? What are some other models being employed in AI research?3
-
I need your help please !
I'm about to buy a netbook to use as a portable dev environment; it should be able to run Eclipse, some code editor, GCC, and a virtual machine.
I've found a Lenovo E135 with 8GB RAM for only 100€, but the main problem is that its processor is an AMD E2-2000 (only 1.75GHz, and AMD)... Will it be enough to do what I want?
I've also found a Lenovo ThinkPad X230 with a Core i5 (2.6 GHz) and 4GB RAM for only 170€, but the problem there is the battery (the Core i5 consumes much more power).
Which one should I buy? (Knowing that I have only 100€, but I could manage to get the missing 70€ for the X230.)
Thanks for your help !10 -
So far I've been pretty lucky... except for the code some of my professors at uni used in their assignments. A couple of them had this horrid habit of giving you a horribly-written, out-of-date (we're talking these chuckle heads used the same code for years on end and wondered why it didn't work on new versions of Java), messy source file with "fill in the blanks" sections like it was some kind of Java Mad Libs book. One of them had an entire jarchive of data structures we were required to use that he'd written in the '90s and NEVER UPDATED. Another one had a script he'd written for his own specialized assembly macro preprocessor that he'd been using without update for who even knows how long. Now, we were using one of those goofy virtual machines with its own simplified assembly language, and we were on the fourth version of the program. This guy'd written his macro processor in Java for the second version, never updated his Java source, only provided a barely-working .bat script for running it, even though the department's official preference was a *nix environment, and implemented this horrid "pretty-printer" that had a regrettable little habit of eating code. You heard that right. You'd run build.bat and it'd expand your macros then send it over to the pretty-printer which would very infrequently just replace the existing program file with an empty file. When we brought it to his attention, he goes "...huh. never happened to me." and proceeded to use the very same set of programs for the next three semesters, even when the assembly simulator was updated again. I heard wails of anguish from the poor sad souls that came after me as their macro processor created program files with deprecated operations, their pretty printer printed out beautiful, perfectly-organized empty files, and the professor responded to every second of a student begging for an updated version with "...huh. never happened to me." I never saw a single bug reported to either of those professors even acknowledged, let alone fixed. Some of the Java Mad Libs were the same ones they'd started using when they first switched the curriculum from Ada to Java. Thankfully after my first year I escaped into the bliss of the next three years, which were full of *nix and C and beauty.
-
This is the single most important question of the year:
Where is a good Minecraft server service that can run FTB Modpacks?
My options are:
1 - A dedicated Linux box with Minecraft management software and a port open to the world.
2 - Purchase a VPS or something similar to host MC.
I have done 1, and it worked pretty well. It ran on a tiny A8 processor with 8GB of RAM and an SSD. I see services that cost like 15 to 20 a month and seem awfully stingy on storage. I can get an enterprise server for 30 a month, but I just asked my webhost (who I really like) and they said they cannot run Minecraft servers. They said I would need a VPS, and they don't have them yet. So I could dig around for a VPS service, and that is an option. I am really wary of "Minecraft"-branded services, as many are outright ripoffs.
Thoughts? Successes? Just do it myself?5 -
Both laptops have the same price.
Which one do you think is better?
(already used) MacBook Air 250GB SSD Nvidia Graphics 4GB RAM Intel Core i5 Processor
(brand new) HP Pavilon 1TB HDD Nvidia Graphics 12GB RAM Intel Core i7 Processor.30 -
Thinking of building my own lab server for core development work. 16GB RAM/Core i7 processor on my mind....What specs work for you?3
-
My morning productivity so far: compiling some code while a full-length show I edited renders. (Not pictured: the machine learning model I'm also training.)
I pity my processor.4 -
I have a feeling that getting into the software branch of the IT industry might have been a wrong decision. In my college years, I got to explore different domains in tech:
1. Software development: frontend tech, backend tech, mobile tech: something I and a million other people know
2. OS and internal software: OSes, compilers, processor coding, chip manufacturing, etc.: don't know what this industry is called, but we devs rarely go that deep down the hole
3. The network industry: computer networks, topologies, packets, data transfers, etc. Again, not sure what this industry is, but 4G/5G brands/Cisco seem to be making a lot of money with this
4. Cloud computing, DevOps, data, etc.: I guess some backend devs explore this domain too
5. AI/ML, data science/web3: the new fad
6. Biotech: ?? don't know anything about this at all
7. Graphics/management/QA: the other associated sisters of software dev; they are seeing a similar recession
8. ... and so on.
I chose the 1st one as my career in my undergrad, and now, regretting this, I am thinking of doing a master's to fix my mistake and take a job in some other industry that is still blooming and has a future that can sustain a recession for at least 30 years.
So, any suggestions/experiences?8
I just completed this heartfelt and sincere little cry for help on another site, but it wasn't verified because I'm not special enough to format it like a PAD, whatever that means. I cannot seem to simply burn music files anymore. I'm using a Samsung laptop. Device name DESKTOP-AII2T2S
Processor Intel(R) Core(TM) i7-2675QM CPU @ 2.20GHz 2.20 GHz
Installed RAM 8.00 GB
Device ID D766A89B-5671-4D9F-B6F9-2D884E9EA309
Product ID 00326-10000-00000-AA880
System type 64-bit operating system, x64-based processor
Pen and touch No pen or touch input is available for this display
Edition Windows 10 Home
Version 20H2
Installed on 09/08/2020
OS build 19042.928
Experience Windows Feature Experience Pack 120.2212.551.0
The music is a combination of commercially released material as well as bootleg recorded material.
I am not looking for a "This is Why We Can No Longer Burn Our Music Files" intro. All you need to tell me is that the corporations that eat the world are protecting their copyrighted music, and I must get up earlier and eat a better breakkie than those individuals. That I can handle. Although I'm not a dev, I'm sure you can understand the feeling after you have worked for hours on attempting something, only to discover your effort has been in vain (much like my former relationships). Again, if you can give me any direction aside from hanging it up and attempting to find happiness elsewhere, sock it to me. I deserve it. Thanks.
11 years ago, when I used a MacBook, putting together a playlist, inserting a blank CD-R, and burning the files onto the CD-R was very easy. I am now faced with hurdles I sometimes scale, only to fall on my face.
I'm not stupid, or uneducated about FLAC, blah blah. I learnt it all myself. I'm now using a Windows operating system. A few weeks ago I was able to burn whatever I pleased and it was OK.
Then one day, it just wouldn't do it. I was following no altered procedures. Since then it's been misery. I remember that ocenaudio once burned music files for me.
I don't know how to go about retrieving an instruction manual that will take me step by step through how to do this.
Your help would be appreciated.
Cheers,
Jonno
I've been lurking here since 2017, when my MacBook died. I've always enjoyed the level of sanity and have attempted to add my jaded, distant and nihilistic spin to a few threads. It won't destroy me if I can't burn files anymore; I'll just go back on heavy tranques and change my name to Ben Zo. Dia Za P.een3
Screw our credit card processor so hard. The powers that be decided to sign with them because their rates were better. That's it. Never mind the fact that they don't make/work with mobile readers, which we need. Never mind the fact that their app is trash and is lacking basic features. Never mind the fact that their support is non-existent. Never mind the fact that when I request a new POS machine, I don't hear back for 6 months, and have to follow up again only to find they forgot about it. Never mind the fact that their POS machines can't handle 2 merchants like our ancient, "out-dated" one could, and so we need to spend double the money and have 2 POS machines sitting on the counter. Never mind the fact that their website is trash and lacks basic functionality. Never mind the fact that I cannot manage our user list (which changes CONSTANTLY), or even VIEW IT. I need to email them for all of this, and they may or may not respond. Never mind the fact that I'm going to spend my entire Friday scrolling through thousands of transactions, looking for one specific one, because their website doesn't even allow me to search for a specific transaction amount. Never mind all of that. Slightly lower rates, baby!1
-
Do you people know anything about a processor with a firewall?
Because I just saw an episode of Arrow where they were trying to break into such a firewall!1 -
Working code?
Or fake compiler?
Fix a problem?
Or buy a new computer?
Bring a flash drive?
Or bring a hard drive?
Use water cooling?
Or use an ice cube on top of the processor and memory?
Drink some coffee?
Or eat a healthy breakfast?
Do you make hardware?
Or software?
These are the problems programmers face from old people as employers or relatives trying to find something to relate to. -
Hey guys, I have a low spec machine with these specs:
32 GB SSD
4 GB RAM
Intel Pentium processor, 1.66 GHz, 64-bit
I want to install a lightweight Linux distro; could you recommend one?
P.S: Currently I am thinking of Bodhi or ArchLabs.4 -
My manager and I setup Cloudflare for one of our websites because we’ve noticed bot activity. Stakeholders have their feathers ruffled because ONE fraudulent payment got through during the first 24 hours of using Cloudflare. Um, there’s no miracle solution and we didn’t promise you miracles.
Manager and I aren't sweating it because 1) we're still learning Cloudflare, 2) we're still familiarizing ourselves with the website because it used to be maintained by an outside agency, and 3) things were much worse a few months ago before any mitigation efforts were put in place. We finally set up Cloudflare because the fraud tools for our payment processor could only do so much.
We’re both honestly surprised a situation like this hasn’t come up before in all the years the website existed.4 -
I hate Visual Studio. It takes 2 minutes to load; it is a glorified word processor. Sure, it has compilers, but that's backend work; it shouldn't be doing any of that at startup. It should take 10 seconds to load. It has an absurd number of annoying checkers, like unused attributes or uninitialized functions (it constantly hits on Start() in MonoBehaviour); I even told it to skip that nonsense and it still does it. It doesn't, however, catch things like the infinite loops you can easily create by accident. You can't duplicate a script, because it can't do the simple thing of ensuring the file name and class name match.7
-
Just typed this into the Python interpreter and my whole system just froze. Guess I have to do a force shutdown.
x = list(range(1, 999999999))
So is there a way you can somehow configure your Linux system such that the window manager/system never runs out of memory or processor time? So that I can at least kill the process that is freezing the system.3
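One low-tech mitigation, aside from system-wide tools like earlyoom or cgroup memory limits, is to cap the interpreter's own address space before experimenting, so a runaway allocation fails with a MemoryError instead of dragging the whole desktop into swap. A sketch for Linux/Unix with an arbitrary 2 GB cap:

# Cap this Python process's virtual address space (Linux/Unix only), so
# runaway allocations raise MemoryError instead of freezing the machine.
import resource

LIMIT_BYTES = 2 * 1024 ** 3  # arbitrary 2 GB cap

soft, hard = resource.getrlimit(resource.RLIMIT_AS)
resource.setrlimit(resource.RLIMIT_AS, (LIMIT_BYTES, hard))

try:
    x = list(range(1, 999999999))  # the line from the post
except MemoryError:
    print("Allocation refused; the desktop stays responsive.")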
Shout out to my fellow non-American programmers who are constantly getting the spelling of colour wrong. Constantly having your word processor nag at you because you spent too much time programming and are spelling it with a letter U, or constantly having your debugger nag at you because it doesn't know what "colour" is4
-
In the banking industry it brings up security concerns. We were in the exact same situation, however using SAS+SPDE with some custom SAS and T-SQL queries. Our database was merely 100TB; still, it was a nightmare to assure stable performance throughout, because SPDE could not properly handle SMT. After daily flow processing times hit 24h++, the managers decided to rent a 6-year-old IBM POWER7 with dedicated processor cores, which eventually cut the processing time down to 15 hours. This was a time-limited contract, for 6 months. I left the company a short while later, but this made the managers rethink buying a more up-to-date server, so now the daily processing flows are around 11.5h. Long story short, sometimes a little architecture optimization does the trick.
-
C - let's see.
C is a procedural language; it follows a sequential method of solving a problem.
Example:
Say a teacher at an institute teaches various subjects: Maths, English, Science and History.
Case 1: One student comes and asks the teacher to teach English,
the next student asks for Maths,
and another asks for History.
Case 2: The next student comes for English.
Case 3: Another one comes for History.
So what I understood regarding C being a procedural language is
that it completes case 1 first, then case 2, and then case 3 (task after task).
Here English is taught 2 separate times,
and History too is taught 2 separate times, adding time and process complexity.
C is compiled for a specific platform and the binary supports only that platform. If I build on Windows with an i3 processor, the compiled program runs only on the same kind of OS and processor when the code is run on other computers.
Single-threaded: if the code is interrupted in between, it stops there and doesn't allow other parts of the code to run.
Java
In this case, if the same cases above are encountered, you tell
the computer to create an English class and tell all the students to attend that class (time-saving, no complexity and not repetitive).
In the same way, you create a History class and make all students attend the class at once.
The students would be the objects created.
Multi-threaded language: if one task is interrupted, the rest of the code isn't stopped; other parts of the code are allowed to run.
JVM: the Java Virtual Machine turns Java bytecode into instructions that can be understood by the computer, whereas C is compiled directly into binary code.
The class concept added to the C language becomes C++.3
Adding opportunistic move to a large recursive tree processor is not a fun exercise; I would advise anyone who intends to dabble in interpreters to design with opportunistic move from the get-go.2
-
Related to the project in my last rant...
Project got delayed for about a month in total because the API for the payment gateway wasn’t allowing charges against stored cards. Could save, modify, and delete them, but no charges.
After a week of trying to get things working based on the documentation, I get in touch with the vendor (great people) who file a support request with the people running the processor so we can see what’s up. Long story short, that amounted to 3 weeks of getting ignored until the vendor raised hell on my behalf, only to get the following reply back:
“You’ve been using the dev credentials, try it on live transactions instead!”
Thankfully, we’re able to move the customer to another processor under the same vendor, where I already have all the requests figured out...2 -
How do you think about unit testing/TDD when writing apps? (I'm writing this at 3am so it might be a bit messy... just a thought I woke up to.)
Whenever I write an app, I don't write unit tests but as I'm developing I may create test functions for specific parts that I run to validate a specific component is working before moving onto the next.
So first, when I get a problem, I break it up into components based on the requirements. It's usually a sort of input, processor, output sequence.
The processor is essentially the core app. So I start coding it, referring to the input through an interface and model objects, adding fields as I go along (assuming that no matter what the input is, I will get these before the logic is called). I may add some more interfaces as well, for other data I may need but know won't be coming in the first input.
So I write all the logic and functions needed to get a basic app running that does what I am writing the app for.
Only then do I write test functions, passing in different parameters to make sure the logic and response are what I want, and making fixes as necessary. At that point I basically have the simplest version of the app.
(I guess this is sort of like mocking?)
Then I build outwards, implementing and testing components as I go along, and may do some simple refactoring/redesign. (I guess all these tests are functional then; they have to start the whole app.)
And finally, when I have the basic requirements fully complete, I add the "nice to haves" on top via refactoring of specific logic in specific components, again testing by running the app, maybe with simple inputs.
I guess what I'm now thinking is: how do you write unit tests/do TDD if the app keeps changing (via ad-hoc refactorings) as you are creating it?
So my processor fan no longer fits on my mobo, because 3 out of 4 standoffs are broken. I don't want to fry my CPU, so I won't be using it until the replacement arrives.
Time to experience a different life for the time being?5
Know that feeling when you break out of a really deep relationship, end up with someone else who's prettier and better in many ways than your ex yet still, during those intimate moments, you can't help but think about your ex and how good life was with them?
My previous laptop, even though it had the same processor and less RAM, and was less pretty (the SSD being the only upgrade in my new one), somehow worked faster and smoother and felt a lot better. :(
I fucking hate explaining basic shit to the computer illiterate. Usually I don't mind, but right now I'm working on a project: I want to automate one thing I need to do every morning, putting two numbers into a web page (I'll maybe explain the details in the next rant). I am the only one who fixes and buys computers and printers (for some problems I call another repair man); generally speaking, I'm working as the IT guy. The firm has like 50 computers, and some of them run SCADA software. Some computers have Win 7, some Win 8 and others Win 10, and I can't upgrade those computers, there's not enough money (I can deal with this problem). And yes, buying computers is not the fastest or easiest thing either. Because it's a public firm, I need to do public procurement (I don't know how to translate it to English), and most of the time the lowest price wins; I'm OK with that. But in the item specification I can't write that I want a specific PC model or its components. Example: I can't write that I want an Intel processor; however, I can write the number of cores and the frequency. But it's not that bad, usually I have a template for all the things I buy. One of the worst things is this: our firm bought a new version of the bookkeeping software; the old version was using the Visual FoxPro framework. Good thing I didn't initiate the purchase, because right now I would be jobless, not because I would be fired, but because our senior accountant would drive me crazy. In fact, accountants drive me crazy, but I can handle it for now. As I wrote before, our firm has about 120 workers, and a major part of the workers are old, like my parents' age (I am 28 btw, Mom is 55). You all know what happens if you say you work with computers. So our accountants are like 60 years old, got the new program, don't know how to work with it, and ask me how to do certain things. If I don't know how, I ask the program's support; every question is like 90 EUR. So in short, the accountants expect me to know their work and how the program works. If I say something they don't like, they try to make my day hard. The next thing is our billing program. The man that worked before me did some payment imports, and when I came everyone expected me to do that. OK, I did it, because the people working with the billing program would probably fuck it up. And I semi-automated it, so I don't mind that much. Sometimes that program fucks up, like it did yesterday: it sent email invoice attachments without a filename. Example: people got this attachment ".pdf" (no filename, only the extension), and if you save it you need to do OPEN WITH and then select a PDF reader, or rename the file (I don't know which is easier). And surprise surprise, our firm's customer support redirects all the phone calls and emails to me. I did explain to customer support what to say to people. They still redirect them to me.
PS: This is my first job after school. I work part-time.
TL;DR: Thinking about my life and career choices. Accountants are not the nicest people.8
A new version of a service request queue processor was deployed to test. It was supposed to have a performance boost compared to the previous version. The performance is so good that the new transactions are still locked when SRQP attempts to process their workload, so it ends up issuing empty transactions in another module. Artificial delay time...
-
The combination of an AIO cooler and liquid metal heatsink paste and a processor that isn't broken has really done wonders for my productivity.
An unexpected side effect of having a machine that can operate under load without instantly blue screening or crashing is that the room it is in turns into an oven pretty quickly. -
My current computer is starting to overheat playing games that used to run fine. Like thermal shutdown I think on the GPU. So I need to have the thermal compound redone. Then I will be out a computer while this is being worked on. I would have a local shop do it, but would rather a vendor that sells thousands of computers do this.
I am thinking I might buy a newer computer and then send the current one in for repair. Then let my kids use this one when fixed. I just got this one where I wanted it disk and setup wise. I also would be stuck with Win 11 on the newer computer. Everything I do on my home computer is windows centric. The people I support, the stuff I run, everything. That is the biggest negative in getting a newer computer. I would rather buy one with windows 10. But they won't even sell that anymore.
The newer computer is much nicer though. 4070 over a 3060. Newer processor with 32 threads. Thinking of going 17" screen instead of 15". So I like the idea of the compile times being faster.
I am loathing the idea of setting all my programs back up again. This sounds like a nightmare. I don't even like thinking about it.
Fix the old reliable thing and/or get the new shiny toy. What bothers me is that it has only been since 2021. I don't remember having to redo heat sink compound on older computers. I keep reading this is a thing. Wtf is happening to the compound? Is it made to fail?9 -
Hello guys,
Does anyone use a Samsung Galaxy S7 with a custom ROM? Are there good ROMs? Can you recommend some?
I have the European S7; I think there are processor differences.
!dev but a parable
I worked at a Walmart Photo Lab with a Fujifilm photo processor. I had a guy ask for his pictures but they weren’t printed, I could see his order but there was no “payload” ( think PO header with no PO lines). He said he ordered 600+ pictures off his SD card, then blew them away because they were ordered.
As I had no physical pictures, there was nothing I could do but say “sorry”. He was mad, but there was nothing I could do.
Moral of the story, verify backups before wiping the system. -
OK. We've got this tiny little pet project of mine (work related)…
I rescued it from the git archive. Simply put: someone hot-glued an Elasticsearch scroll and a document processor together.
After a lot of refactoring, I had a simple, much-improved (non-parallel) Akka worker system without an Akka topology / hierarchy.
I left out the hierarchy at first, because I didn't know Akka at all.
I've worked with a lot of process workflows, and some systems that come very close to IPC, so I wasn't completely in the dark.
Topology requires knowledge / creation of a state machine / process workflow. And at that point of time I just had... Garbage. Partially working garbage.
Yesterday I finished the rewrite into several actors... Compared to before, there are 8 actors vs 2... and roughly 20 more classes. Mostly since I rewrote the Akka receive methods as command DTOs... and a lot of functions needed to be separated into layers (which were non-existent before).
Since that felt more natural than the previous chaos of passing strings or other primitive types around, or in the worst case just object....
(Yes: previously an actor was essentially a class with one or more "doEverything" functions, and maybe a few additional functions, which did everything from REST client to processing.)
Then I draw the actual state machine based on everything I've written in the last weeks and thought about how to create the actual topology and where / how parallelizing might make sense.
Innocent me stumbled in the Akka Docs on Akka Typed... (Didn't know it existed, since I'm very new to Java and Akka).
Hm, that sounds an a lot like what I did. In an different way, yes. But not so different that it might be VERY hard to port to.... And I need to change (for implementation of hierarchy) a few classes....
[I should have known at this stage that my curiosity would get the best of me, but yeah. Curiosity killed the cat.]
Actually the documentation is not bad. It's just that upon reading the first more complex examples, my brain decided to go into panic mode.
They've essentially combined all classes into one class in all the source code examples [which makes more sense later], which makes it fscking hard for a chaotic brain like mine to extract information....
https://doc.akka.io/docs/akka/...
The thing is: It's not hard to understand… actually very simple.
It was just my brain throwing a fuck-you tantrum.
So I opened more examples in other tabs and cross-referenced what happened there and why...
A few frustrated hours later I got that part.... and the part about why it's called Akka Typed. It was pretty simple....
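To give you an idea, here is a rough sketch of the OOP style from memory (the names Scroller and FetchPage are made up for illustration, so don't quote me on every detail): one AbstractBehavior per actor, with command DTOs as the only messages it accepts.

import akka.actor.typed.Behavior;
import akka.actor.typed.javadsl.AbstractBehavior;
import akka.actor.typed.javadsl.ActorContext;
import akka.actor.typed.javadsl.Behaviors;
import akka.actor.typed.javadsl.Receive;

public class Scroller extends AbstractBehavior<Scroller.Command> {

    // command DTOs instead of passing strings / Object around
    public interface Command {}
    public static final class FetchPage implements Command {
        public final String scrollId;
        public FetchPage(String scrollId) { this.scrollId = scrollId; }
    }

    public static Behavior<Command> create() {
        return Behaviors.setup(Scroller::new);
    }

    private Scroller(ActorContext<Command> context) {
        super(context);
    }

    @Override
    public Receive<Command> createReceive() {
        // the compiler now knows exactly which messages this actor accepts
        return newReceiveBuilder()
                .onMessage(FetchPage.class, this::onFetchPage)
                .build();
    }

    private Behavior<Command> onFetchPage(FetchPage msg) {
        getContext().getLog().info("fetching scroll page {}", msg.scrollId);
        // ... call Elasticsearch here and forward the hits to a processing actor ...
        return this;
    }
}

That's basically all "Typed" means: Behavior<Command> replaces the untyped ActorRef, so the compiler checks the protocol instead of me.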
Open the gates of hell, bloody satan that was too easy for fucks sake.
Nooooow.... I just need to port my stuff to Akka Typed.
Cause. Challenge accepted, bitch - eh brain. You throw tantrum, you work overtime. -.-
I just cannot decide whether to go FP or OOP.
Now... I'm curious whether FP is that hard... I hadn't dealt with it much before.
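For comparison, the functional style seems to boil down to a factory method returning a Behavior, rather than a class with fields. Another rough sketch, reusing the made-up Command / FetchPage types from the snippet above:

import akka.actor.typed.Behavior;
import akka.actor.typed.javadsl.Behaviors;

public class ScrollerFn {
    // reuses the hypothetical Command / FetchPage message types from the sketch above
    public static Behavior<Scroller.Command> scroller() {
        return Behaviors.receive(Scroller.Command.class)
                .onMessage(Scroller.FetchPage.class, msg -> {
                    // handle the page here, then keep behaving the same way
                    return Behaviors.same();
                })
                .build();
    }
}

State lives in the parameters of the factory method instead of in fields, so switching behaviors means returning scroller(newState) instead of mutating this.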
Can someone please stop me... I'm far too curious again. -.- *cries*6 -
Question for all you Java devs out there (and a story). We have a customer whose machine we are interfacing with. They sent us a simulator that runs on Windows, written in Java. I did try running it on Linux, but the JVMs I have installed did not like it at all; it complained about missing deprecated stuff. I start up their sim in Windows and find it is opening between 50 and 100 network ports. Then I tell the simulator to run inside the GUI for their program. It pegs all of my processor cores at 100%. I have pegged cores in software before, then corrected the errors in my code. However, I have never pegged all the cores at once. I am kind of in shock.
So, how hard is it to peg all the cores in a Java program? I assume they have a thread on every port or some nonsense?6 -
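For reference, here's the kind of thing I imagine is going on, as a minimal sketch (pure guesswork about their sim): one busy-polling thread per "port" will happily peg every core.

public class PegAllCores {
    public static void main(String[] args) {
        int ports = 64; // far more threads than cores
        for (int i = 0; i < ports; i++) {
            Thread t = new Thread(() -> {
                while (true) {
                    // busy-poll instead of blocking on the socket -> 100% CPU per thread
                    Thread.onSpinWait();
                }
            });
            t.start();
        }
    }
}

Blocking reads or a selector would sit near 0% CPU, which is why a sim behaving like this smells like polling loops rather than anything genuinely heavy.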
1997 Olivetti, 122 MHz Intel processor, 8MB RAM, a 1GB HDD and Win 95. I mostly used children's learning software and games.
But my first “computer” was a shoebox with a keyboard drawn by hand on the cover and a screen on the bottom where I could change the “software” by swapping different drawings inside a transparent envelope.
All hardware and software made by me 😁 -
i5 vs. i7?
My old laptop is dying and I've started to look for a replacement. I am used to having an i7 processor, and my dev env consists of either Vagrant and a VM or a docker-compose env. Plus I use JetBrains IDEs. Mostly I do webdev, so not that much compiling going on; however, I often need a Windows VM to run Photoshop or Illustrator...
So I guess the question is: Is there a notable difference between an i5 and an i7 for development?4 -
I was using Ubuntu with the GNOME environment. I changed to the LXDE desktop since my laptop became very slow. It is more lightweight than GNOME.. but I don't like it.. I want to try some other distros with fewer GUI things..
Give me some suggestions.. some programming-friendly distros (lightweight, fewer GUI things)..???
My laptop has an i5 processor and 4GB RAM (5 years old)..13
Hi devRant. Wanna rant about some shit at my company. First, some good parts. I work in a company with 600+ employees. It's one of the best companies in my region. They provide you with all kinds of sweets (cookies, coffee, tea, etc), any hardware you need for your work (additional monitor, more RAM, SSDs, processor, graphics card, whatever), just about everything you need to make your work faster and more comfortable. Then, we have regular reviews (every 6 months), which raise your salary by $0.75 to $1.5 per hour. (I live in a poor country, where $15 per hour makes you more solvent than 70% of people, so a 100-200 buck increase every half year is quite a good raise.)
The resulting raise from a review depends on how satisfied the team leader and project manager are with my work. And here starts the interesting part (i.e. where the shit comes in).
1) Seniority level in our company depends on the salary you have. That's right. It does not depend on your skill. Except when you're applying for a vacancy. So if you say you're a senior dev and prove it during the interview, you'll get a senior's salary. This is fine if you just want money. But not if you love programming (like me), because of the reasons below.
2) You don't need to have lots of programming experience to be a team leader. You can even be a junior team leader (but thank god, on research projects only). You start by leading research projects and then move to billable ones if the director of the research department is satisfied with your leadership skills.
As a consequence, our seniors are dumb AF. This pisses me off the most. Not all of them; I would say half of them are real pros, but the rest suck at programming (for a senior). They are around junior/middle level.
I can understand if a guy has a $15 rate but still remains a junior dev. That's fine. But hell no, he is treated as a middle because his rate is $10+ now! And his opinion has priority over middles and juniors. Not that juniors have a lot of good thoughts, but sometimes they do.
For now I'm lucky enough to work on a small project, so I'm the only dev and, so to speak, my own TL. But my colleague has this kind of senior team leader who is dumb AF. They work on an ASP.NET Core project, and the senior does not even know how to properly write generic constraints in C#. Seriously.
Just look at this shit. Instead of
class MyClass<T> where T : class { }
he does this:
abstract class EnsureClass { }
class MyClass<T> where T : EnsureClass { }
He writes an empty abstract class and forces other classes to inherit from it (thus wasting their only base-class slot, which could have gone to something useful) just to ensure that the generic T is a class. What the FUCK is wrong with you, dude?! You're a senior dev and you don't even know the language you're coding in.
And this shit is all over the company. Every monkey that had enough skill just to not get fired and enough patience to work 4-5 years becomes a senior! No-fucking-body cares about or reviews your skill growth. The whole review is the department director asking the TL and PM questions like "how is this guy doing? Is he OK or should we fire him?" That's the whole review. If the TL does not like you, he can leave a bad review and the company will put you on trial. If you confront the TL during this period, pack your suitcase. I know of two such cases personally. A good, skilled guy simply could not find common ground with his TL and got fired. And the cherry on top is that they don't care about the fired dev's side; they only listen to the reviewer. This is just absurd and it boils me over.
That's all I wanted to say. Thanks for your attention.
This conversation happened yesterday between me and a client:
"I need drivers for my Hackintosh build, macOS doesn't find and install the graphics card and the NIC"
I laughed at first, then I asked him about his configuration..
"I have a 6700K and an MSI M3 H170"
Okay, things are getting worse here...
"So do you know the difference between H170 and Z170?"
"Yes, I have bought an H170 one because i do not overclock"
"Then, tell me why the f*ck you have bought a K processor if you do not f*cking overclock..."
"Because is more powerful"
F*ck no, he did not just say that...
I'm fucking tired of my computer having random 2-second latency on any basic action and being slow as fuck despite a powerful processor, an SSD and 32GB of RAM. Music via Bluetooth is basically unusable, since every few seconds the music stops for 0.2s and then plays again. I installed this system (openSUSE Tumbleweed) in February this year and it's just sad that I have to reinstall again (any ideas for a distro)?
I made the dumb mistake of buying a CPU without integrated graphics, and this resulted in having to buy a GPU. So I got myself an Nvidia card (another mistake), since I thought I would be using CUDA at university. Turns out CUDA cannot be installed for some retarded reason.
With the Nvidia GPU, the screens on my two monitors swap every time I use an HDMI switch to use the other computer. With an AMD GPU this problem does not exist. The AMD GPU pro drivers are impossible to install. Computers barely fucking work, change my mind. Shit is breaking all the time. Everything is so half-assed.
The music player that I use sometimes swaps its UI with whatever is below it, for example the desktop background, and I need to kill the process and start it again to use the program. WTF.
Bluetooth seems to hate me. I check the Bluetooth devices connected to my computer and it says the headphones are connected. BULLSHIT. The headphones are fucking turned OFF. How the fuck can they be connected, you dumbass motherfucker computer. So I turn on the headphones. And I cannot connect them, since the system thinks they are already connected. So I have to unpair them and pair them again. WTF. Who fucking invents this bullshit?
Let's say I have the headphones connected to the computer. I want to connect them to my phone. I click connect in the phone settings. Nothing happens. Bullshit non-telling error: "could not connect". So I have to unpair from the computer to pair to the phone. Which takes fucking minutes, because reasons. VERY fucking convenient technology.
The stupid Bluetooth headphones have a loud EARRAPE voice when turning them on: "POWER ON!!! PAIRING", "CONNECTED", "DISCONNECT". The loudness of this cannot be changed. The 3 navigation buttons are fucking indistinguishable, so I always take a few seconds to make sure I press the correct one.
The fucking keyboard sometimes forgets that I remapped the Esc key to Caps Lock, and then both keys stop working, so I need to reconnect the keyboard cable. At least it's not fucking Bluetooth.
The only reason HDMI switches exist is because monitors' navigation menus have terrible UI and/or infrared-activated, non-mechanical buttons.
Imagine a world where monitors have a button for each of their inputs. I press the HDMI button, it switches its input to HDMI. I press the DisplayPort button, it switches to DisplayPort. But nooo, you have to go through the OSD menu.
My ~ directory has hundreds of files that I never put there. Doesn't feel like home, more like a crackhead crib.
On my other laptop (also Tumbleweed) I click the hibernate option and it shuts down. WTF. Or sometimes I open the lid and the screen is black, and when I press the keyboard nothing happens, so I have to hold the power button and restart.
We've had computers for 20+ years and they are still slow, unreliable and barely working.
Is there a cure? I'm starting to think the reason everything works so shittily and unreliably is that the foundations are rotten. The systems we use are built in C, riddled with cryptic abbreviated code, undefined behavior and security vulnerabilities. The more C programs I've written, the more convinced I am that we should have abandoned it for something better long ago. Why haven't we? And honestly, what would be better? Everything fucking sucks. Rust seems to be the light at the end of the tunnel, but I don't know if it's only hype or really better. I'm sure it can't be worse than C or C++. Either we do something about the foundations or we're doomed.22
PHP devs, quick question:
I have a project using PleskPHP5 (or at least that's the processor I found in the config).
I'm a bit familiar with Laravel, though I'm not sure whether they are different. How do I migrate to Laravel?
Also, the project was developed on Windows and I want to run it on Linux; do I need to change anything?
I'm a complete noob in PHP, but I want to learn with this project11
I love everything about the Nvidia TX2 board... except the ARM64 architecture. Catch me constantly building shit from source :(
Seriously though, I wish there were just one universal, open-source processor architecture standard to rule them all.1
Just when I thought that surely a single SIMD dot product instruction must be the fastest way to calculate a damn dot product on a processor, Agner Fog's instruction tables come flying out, hitting me over the head and telling me that manually calculating a dot product may actually be faster sometimes
Why must computers be like this?
I just came out of a bad relationship with hardware rasterization being horribly slow and now I can't even trust my processor to do things properly
This is how people develop trust issues1 -
Found my grandmother's Vista laptop from '07. It has 3 problems:
1. Battery lasts 15 seconds at full charge (not a shock)
2. Plastic shell around screen is cracked (not badly, likely fixable with plastic welding)
3. IT'S SLOW AS FUCK
Point 3 is strange, considering the specs:
AMD dual-core CPU at 1600MHz (better than MY laptop...)
Some NVidia onboard GPU thing (no clue)
2x 512MB DDR2-L (immediately swapped out with 1x2GB and 1x1GB, the most we have at the house)
What I noticed, though, is that it's got a SOCKETED CPU (a rarity only like 3 years later), specifically an S1 socket. There's an AMD Phenom quad-core clocked at 1800MHz with that socket, for $16 on eBay no less!
However, it's a different processor family, so I don't know if it'd work in the board...
Anyone think it'd work?9 -
So are game consoles also affected by these processor problems? Many articles just mention PCs and phones5
-
Why do we still use floating-point numbers? Why not use fixed-point?
Floating-point has precision errors, and for some reason each language seems to surface a different level of error, despite all of them running on the same processor.
Fixed-point numbers don't have precision issues (unless you get way too big, but then you have another problem), and while they might be a bit slower, I don't think there is enough of a difference in speed to justify the (imho) stupid, continued use of floating-point numbers.
Did you know some (low-power) processors don't have a floating-point unit? That effectively makes it pointless to use floating-point there; it offers no advantage over fixed-point.
Please, use a type like Decimal, or suggest that your language of choice adds support for it, if it doesn't yet.
There's no need to suffer from floating-point accuracy issues.26 -
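If you want to see what I mean, here's a tiny demo (Java only because it ships both kinds of arithmetic out of the box; BigDecimal stands in for a decimal / fixed-point type):

import java.math.BigDecimal;

public class PrecisionDemo {
    public static void main(String[] args) {
        // binary floating point: 0.1 and 0.2 have no exact representation
        System.out.println(0.1 + 0.2);        // 0.30000000000000004
        System.out.println(0.1 + 0.2 == 0.3); // false

        // decimal arithmetic: exact for these values
        BigDecimal a = new BigDecimal("0.1");
        BigDecimal b = new BigDecimal("0.2");
        System.out.println(a.add(b));                                       // 0.3
        System.out.println(a.add(b).compareTo(new BigDecimal("0.3")) == 0); // true
    }
}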
//not a rant
In September I will be studying information technology at university, and I was wondering what kind of laptop I should buy:
optimal screen size? (13.3 or 15.6)
ram? (for VMs)
processor?
dedicated graphics? (or should I just stick to integrated)
I will be mainly coding, running VMs and doing school stuff (maybe some casual games like League or Minecraft), so I was wondering if you guys could help me out with the decision18
I'm tryna build a new PC for data-related work. I already have an i7 8700K processor with a decent cooler in an ASRock Z370 Extreme4 mobo and 16 GB RAM. I can't decide between an RX 5700 XT and an RTX 2080 Super.
Please help. Your valuable opinion is highly appreciated.5 -
!long_rant
So, my previous rant was one of my most-liked rants so far; in it I simply stated that I am moving to Ubuntu.
I feel ashamed to say that I am moving back to Windows (for now at least).
The reason is not Ubuntu; I actually found it lovely. It's my friggin laptop. While there are many other problems, the biggest is that the wifi adapter's range has dropped drastically on Linux. On Windows it was working fine; I also checked Kali and Tails (live) and both show the same issue. My router was literally 3 meters away and this shitty laptop couldn't connect. Since it connects properly on Windows, it's obviously a driver problem; the shitty manufacturer hasn't released proper Linux drivers for devs to work with. Now, I have tried a lot of suggestions from SO, but if I can't find a solution I will either have to arrange another laptop or simply go back to that always-updating OS. There is another problem: the graphics card is not even being recognized. No idea how shitty the hardware in this laptop is.
Btw it's an HP ac110tx.
Tell you what, I recently bought it from Paytm thinking: an i7 5th-gen processor with a graphics card for a reasonable price, what can go wrong? Well, this is the issue. The hardware was shitty. I hate this laptop. If I had the money I wouldn't think twice before dumping it on eBay/Quikr. But we are struggling devs (at least I am), and I don't have a big amount coming in soon.
Very depressed and angry!13 -
So a few months ago I broke the screen of my laptop. Currently I'm quite broke, so I can't replace the screen, and for some time I was using a TV as the screen. But of course Windows had to crash or do some similar shit, and now it doesn't send a signal via HDMI. It's probably showing some info, but the signal is only sent while Windows boots or something.
So my girlfriend gave me her old laptop (4GB RAM and an i3 processor, but a touchscreen :/). Windows hadn't been updated for quite a long time (it was still Windows 8), so I tried to update it. Of course there had to be a problem: DISM doesn't work, downloading the ISO doesn't work, fml. I figured I'd have to live with that, but later disk usage started hovering around 100% and the machine would freeze for a few minutes (the shitty Win2k PC at uni was more responsive). Then I tried to refresh Windows; DISM started working, updates semi-working. I was left with 21 updates failing with errors, and there the conversation starts:
Me: install 21 updates
Win: kk. Or actually no
Me: please
Win: the best I could do is 8.
Me: it's something
Win: actually fuck it, only 4
Me: I'm done *typing Manjaro xfce*
So now I dual boot with Manjaro, which uses 40% of the RAM with Firefox open, while Windows used 30% on its own. I can't game anyway, and DF is on Linux, so fuck Windows.
I am a noob when it comes to Linux (and everything else, actually), but it makes me want to learn and improve.16
Hey, can anyone help me out with building a Hackintosh running Mojave? I have an Intel 6200U Skylake processor in an HP 15ab522tx laptop. Please share links and resources.5
-
So here is something that I haven't yet figured out how to do
How can I automate a VPN connection and call some APIs through that connection?
I have a task where I have to turn on something like TunnelBear, connect to an American node, make an API call, and then turn off the VPN. I just can't figure out a way to automate these steps without UI automation, and I don't plan on doing UI automation for this case. I'd like something that works as a background process running every few minutes, typical script automation, but this time with VPN automation.
So what are your suggestions?
Show me what you got!4 -
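To make it concrete, this is roughly the shape of thing I'm after (a rough sketch, assuming the VPN exposes some CLI; wg-quick from WireGuard is just the one I know, and the config name us1 and the API URL are placeholders):

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class VpnJob {
    public static void main(String[] args) throws Exception {
        // bring the tunnel up via the VPN's CLI (wg-quick here; "us1" is an assumed config name)
        run("wg-quick", "up", "us1");
        try {
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request = HttpRequest.newBuilder(
                            URI.create("https://example.com/api/thing")) // placeholder URL
                    .GET()
                    .build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println(response.statusCode() + ": " + response.body());
        } finally {
            // always tear the tunnel back down, even if the API call fails
            run("wg-quick", "down", "us1");
        }
    }

    private static void run(String... cmd) throws Exception {
        int exit = new ProcessBuilder(cmd).inheritIO().start().waitFor();
        if (exit != 0) throw new IllegalStateException(String.join(" ", cmd) + " exited " + exit);
    }
}

Scheduling it every few minutes would then just be cron, a systemd timer or Task Scheduler calling this, with no UI automation involved.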
So my desktop suddenly stops working in the middle of a Skype call. I open up the case and the processor fan wire has come unplugged, ever so subtly. No visible damage or short circuits. What sorcery is this?1
-
University course on Computer Systems.
I really, really suck at bit manipulation and processor architecture. I tried learning it multiple times but lost focus every time; it was really difficult to stay invested and I barely passed that course4
Whether baked or no-bake, a strawberry cheesecake is a showstopper that combines the creamy richness of the cheesecake with the sweet and slightly tangy essence of strawberries. It’s a classic dessert choice for celebrations, springtime gatherings, or any occasion where the irresistible combination of cream cheese and fresh strawberries is sure to be a crowd-pleaser.
No-Bake Strawberry Cheesecake Recipe:
Here’s a simple recipe for a no-bake strawberry cheesecake:
Ingredients For Strawberry Cheesecake:
For the Crust:
1 1/2 cups graham cracker crumbs
1/3 cup melted butter
2 tablespoons granulated sugar
Cheesecake Filling:
16 oz (450g) cream cheese, softened
1 cup powdered sugar
1 teaspoon vanilla extract
1 1/2 cups fresh strawberries, hulled and diced
2 tablespoons lemon juice
Strawberry Topping:
1 cup fresh strawberries, hulled and sliced
1/4 cup strawberry jam or preserves
Instructions For Strawberry Cheesecake:
Prepare the Crust:
In a bowl, combine graham cracker crumbs, melted butter, and granulated sugar. Mix until the crumbs are evenly coated.
Press the mixture into the bottom of a 9-inch springform pan to form an even crust. Place it in the refrigerator while you prepare the filling.
Make the Cheesecake Filling:
In a large bowl, beat the softened cream cheese until smooth.
Add powdered sugar and vanilla extract, and continue to beat until well combined.
In a blender or food processor, puree the diced strawberries with lemon juice until smooth.
Fold the strawberry puree into the cream cheese mixture until evenly incorporated.
Assemble the Cheesecake:
Pour the strawberry cream cheese filling over the chilled crust in the springform pan.
Smooth the top with a spatula and refrigerate for at least 4-6 hours, or preferably overnight, to allow the cheesecake to set.
Prepare the Strawberry Topping:
In a small saucepan, heat strawberry jam or preserves over low heat until it becomes smooth and liquid.
Allow the jam to cool slightly before spreading it over the top of the chilled cheesecake.
Arrange sliced strawberries on top for decoration.
Serve:
Carefully remove the cheesecake from the springform pan before serving. Slice and enjoy! This no-bake strawberry cheesecake is a refreshing and delightful dessert that’s perfect for warm days or when you want a fuss-free, delicious treat.2 -
Is it worth having an Intel i9 yet? Or does software not yet know how to take advantage of it?8
-
Old stuff...
Bought a lappy with an AMD processor
It was getting hot a lot and the fan was way too loud
After checking, I got to know about the #PowerNow option in the BIOS
.
.
Turned it off, and now, finally:
No more overheating, no more fan noise
Looks like it was meant to be disabled but had been enabled unknowingly...
Hahhah
...1 -
Need some help here!!
I'm learning machine learning, so I'm planning to buy an Asus R510JX-DM230T. Are the specs below enough to practice TensorFlow? Specs: 2.6GHz Core i7 4720HQ processor,
8GB DDR3 RAM, 1TB 5400rpm Serial ATA hard drive
15.6-inch FHD Anti-Glare Display, 2GB Nvidia GeForce GTX 950M Graphics13 -
Here lads, quick question: would a 0.3GHz processor speed bump really make that much of a real-world difference on a Skylake i7?
-
Hi everyone, I'm very confused. I have the opportunity to buy a MacBook Pro 13'' 2016 with 8GB of RAM or a MacBook Pro 15'' 2015 with 16GB of RAM. I really need some advice because I can't decide between the newer processor (2016) and the extra RAM (2015). I work as an iOS/web developer12