• KillingTimeItself@lemmy.dbzer0.com · 6 hours ago

    I just upgraded this year to an R9 5900X from my old R5 2600; still running a 1070, though.

    I do video editing and more generally CPU intensive stuff on the side, as well as a lot of multitasking, so it’s worth the money, in the long run at least.

    I also mostly play Minecraft and Factorio, so.

    Ryzen 5000 is a great upgrade path for those who don’t want to buy into AM5 yet. Very affordable. 7000 isn’t worth the money unless you get a good deal, same for 9000, though you could justify it alongside a new motherboard and RAM.

    • ArxCyberwolf@lemmy.ca · 3 hours ago

      I’m rocking a 5800X and see no reason to go to 7000 or 9000 anytime soon. It’s been great since I built the PC.

  • Malfeasant@lemm.ee · 8 hours ago

    I’m still using the i7 I built back in 2017 or so… Upgraded to an SSD some years ago, and will be upping the RAM to 64 gigs (the max the motherboard can handle) in a few days when it arrives…

  • padge@lemmy.zip · 12 hours ago

    I’m the one person people go to for PC part advice, but I actually try to talk them down. Like, do you need more RAM because your experience is negatively impacted by not having enough, or do you think you should have more just because?

    • JohnEdwa@sopuli.xyz · 8 hours ago

      Ha, I had this exact conversation with a friend of mine a few days ago. He wants to upgrade from 16GB to 32GB, and when I asked why, he just blanked out for a while and went “…because more is better, right?”

      He spends most of his time playing RPG Maker porn games and Raid: Shadow Legends, also really taxing that RTX 3070 he bought right in the middle of the pandemic.

  • BoxOfFeet@lemmy.world · 12 hours ago

    I built a PC in 2011 with an AMD Phenom II. Can’t remember which one; it may have been a 740. And I’m pretty sure I ran a Radeon HD 5450 until FO4 came out in 2015 and I needed a new graphics card. Upgraded to a Radeon R7 240 and some other AM3-socketed CPU I found for like $40 on eBay. By no means was I high-end gaming over here. And it stayed that way until 2020, when I finally gutted the whole thing and started over.

    It ran everything I wanted to play, so I got like 9 years out of about $600 in parts. That’s including disc drives, power supply, case, and RAM, and I’m still using the case. I got my money’s worth out of it, for sure. The whole time we were in our apartment it was hooked up to our dumb TV, so it was our only source of Netflix, YouTube, DVDs, and Blu-rays. It was running all the time. Then I gave all the innards to my buddy to make his dad a PC for web browsing. It could still be going in some form, as far as I know.

    • SkyezOpen@lemmy.world · 9 hours ago

      I remember the 5450! I got one when Wrath of the Lich King dropped because my Dell’s integrated graphics couldn’t handle Strand of the Ancients. That baby got me from 2 FPS to 15. It served me until I left for school.

  • RBWells@lemmy.world · 14 hours ago

    I showed this to my penultimate daughter, who co-opted my (literally 2014) Dell PC; the only thing I’d ever done to it was add memory, and it’s still a beast. I said “look, your 4chan twin” and she cracked up. But if she doesn’t steal it when she moves out, I’ll probably be able to get ten more years out of it.

  • UnderpantsWeevil@lemmy.world · 15 hours ago

    I tend to swap my RAM out every 3-5 years and notice a significant improvement in performance. Other than that, though…

  • OR3X@lemm.ee · 18 hours ago

    I originally built my current PC back in 2016 and only just “upgraded” it last year. I put upgrade in quotes because it was literally a free motherboard and GPU my buddy no longer needed. I went from a Core i5-6600K to a Ryzen 5 5500GT, and from a GTX 960 4GB to a GTX 1070. It still plays all the games I want it to, so I have no desire to upgrade it further right now. I think part of it is that I’m still using 1080p 60Hz monitors.

    • lightnsfw@reddthat.com · 13 hours ago

      I was running one from 2011 up until 2 years ago, when I finally hit a wall in a game I was trying to play and had to upgrade the processor (which meant a new motherboard, which meant new everything). Prior to that I had only upgraded the GPU a couple of years earlier, which I really didn’t need, but it was a present to myself and I was able to give the old one to my brother. By the time this one is outdated I might not even be interested in computers anymore, the way things are going with technology.

  • merthyr1831@lemmy.ml · 19 hours ago

    If you had a top-of-the-line PC in 2014 you’d be talking about a 290X/970/980, which would probably work really well for most games now. For CPU, that’d be a 4th-gen Intel or AMD Bulldozer, which despite its terrible reputation probably runs better nowadays thanks to better multi-threading.

    A lot of the trending tech inflating minimum requirements nowadays is stuff like ray tracing (99% of games don’t even need it) and higher-FPS/resolution monitors that aren’t that relevant if you’re still pushing 1080p/60. Let’s not even begin with Windows playing forced obsolescence every few years.

    Hell, most games that push the envelope of minimum specs, like Indiana Jones, are IMO just unoptimised messes built on UE5 rather than legitimately out of scope for hardware from the last decade. Stuff like Nanite hasn’t delivered on enabling photorealistic asset optimisation, but it HAS enabled studios to cut back on artist labour in favour of throwing money at marketing.

    • dukatos@lemm.ee · 17 hours ago

      That CPU started as a development Linux workstation, then a Windows gaming rig, then served a couple of years as an unRaid server, and now runs a Windows 10 workstation for my mother-in-law. Still fast enough for everyday use.

    • JasonDJ@lemmy.zip · 17 hours ago

      My i7-920 lasted a lot longer than I ever thought it would. I still have it, but I don’t need the power anymore since I don’t have time to PC game. It was in a P6T V2, actually, and I think I replaced it with a Xeon processor.

      • Yerbouti@sh.itjust.works · 17 hours ago

        IDK, I have 200+ games and they all work. In terms of AAA, I played all the recent Fallout, Doom, and Tomb Raider titles and many others. I even played Hellblade in VR. Definitely good enough for me.

    • merthyr1831@lemmy.ml · 19 hours ago

      What sorta stuff do you play? I built an i5-2500K system a couple of years back (2020-ish) and it struggled a fair bit, but it was on the cusp of 1080p60 in the few games I tested, like Fortnite, F1 2019, Warzone, etc.

      • Yerbouti@sh.itjust.works · 18 hours ago

        I just don’t play online games, never have. I can play pretty much any single-player/co-op game at medium/1080p. Maybe the most recent titles like Elden Ring would struggle, but I have hundreds of games in my library and they all work fine.

        I even made a small VR project with it, although every manufacturer said it wouldn’t work. The GPU is a 1060.

        Overall, I’ve spent around $600 on this computer over 15 years and it’s still a perfectly capable PC. I have another PC and a MacBook for work, but the i5 has been our streaming/gaming PC for years.

      • Uncut_Lemon@lemmy.world · 18 hours ago

        2500Ks are good overclockers; I ran one for many years at 4.7GHz. It definitely kept my CPU relevant way past its supposed lifespan.

  • PieMePlenty@lemmy.world · 23 hours ago

    I thought anon was the normie? The average person doesn’t upgrade their PC every two years. The average person buys a PC and replaces it when nothing works anymore. Anon is the normie; they are the enthusiasts. Anon just isn’t hanging with a group of people with matching ideologies.

    • stevedice@sh.itjust.works · 11 hours ago

      A lot of people have forgotten that gaming, and talking about gaming on Discord, is not the norm. However, in 4chan-speak, “normie” just means “not an incel”.

    • phlegmy@sh.itjust.works · 17 hours ago

      Yeah it’s pretty normal…
      A lot of people use discord to hang out with their friends.

      Not me though, I have no friends.

  • LazerFX@sh.itjust.works · 22 hours ago

    I had an i5-2500K from when they came out (I think 2011? Around that era) until 2020, overclocked to 4.5GHz; it ran solid the whole time. Upgraded the graphics card, drives, memory, etc., but that was incremental, as needed. Now on an i7-10700K. The old PC has been sat on the side and may become my daughter’s or wife’s at some point.

    Get what you need, and incremental upgrades work.

    • Zink@programming.dev · 13 hours ago

      I just installed Linux on my old 2500K @ 4.5GHz system a few days ago! I haven’t actually done much with it yet because I also upgraded the OS on a newer system that is taking over server duties. But you are correct on the year, 2011. I built mine to coincide with the original release of Skyrim.

      The install went quickly (Linux Mint, so as expected) and the resulting system is snappy yet full-featured. It’s ready for another decade of use. Maybe it will be a starter desktop for teaching my second grader (educational stuff, as well as trying a mouse for games compared with a controller).

      • LazerFX@sh.itjust.works · 11 hours ago

        I got screwed over with the motherboard; it had to go back because of bimetallic contacts in the SATA ports that could wear out and stop working, so there was a big recall of all the boards… It was an amazing system though, and if I hadn’t seen the computer I’m currently running for an absolute steal, I’d probably still be running it with a 3060 as a pretty potent machine.

        Of course, then I’d never have the experience of just HOW FAST NVME IS! :⁠-⁠D

    • Trainguyrom@reddthat.com · 17 hours ago

      I was rocking an i7-4790K and a GTX 970 until about 2 years ago; now I’m rocking an i5-10400F and one of Nvidia’s chip-shortage-era RTX 2060s. My wife is still on an i5-4560 (from memory) and an RX 560, and that’s really getting long in the tooth with only 4 threads; the budget GPU doesn’t help matters much.

      Later this year, when Windows 10 gets closer to EOL, I figure I’ll refresh her machine and upgrade the SSD in mine.

  • bluewing@lemm.ee · 19 hours ago

    My $90 US AWOW mini with a Celeron J4125, 8 gigs of shared memory, and a 128-gig SSD seems to run FreeDoom as well as any of the other potatoes them GamerBois’ fancy water-cooled custom boxes have…