Stage Select

The PC Gaming Thread, no console peasants allowed


DarkSakul

Recommended Posts

On 9/5/2022 at 12:04 PM, Psychoblue said:

Has this ever happened to anyone?  I've been a big fan of Game Bar for as long as I've been a PC gamer and I've never seen this happen before EVER so I never bothered to check until it was too late.  

Based on the video you uploaded to Twitter, that's not 1920x1008 either.

 

1920x1008 means you're missing 72 vertical pixels.  You'll lose some of the top/bottom of the image or get some vertical squashing, but you should still have the full width.  Should still be perfectly watchable.

 

In your Twitter video you're missing huge parts of the image, both horizontal AND vertical.  It's not just capturing at the wrong resolution -- it looks like it cropped out most of the picture.

 

I don't have much experience with Game Bar, but OBS is pretty much the standard for doing screen capture these days.  Going forward I'd just use that -- it's way more configurable and should work much better.  You're probably already using it to stream, so you might as well just do offline recording with it too.
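The pixel math above is easy to sanity-check. A minimal sketch (assuming 1920x1080 is the full-screen target, as the post implies):

```python
# Quick check of the resolution math: a 1920x1008 capture of a 1920x1080
# screen keeps the full width and loses only 72 rows of pixels.
full_w, full_h = 1920, 1080   # assumed full-screen resolution
cap_w, cap_h = 1920, 1008     # resolution Game Bar actually captured

missing_w = full_w - cap_w
missing_h = full_h - cap_h

print(missing_w)  # 0  -> full width intact
print(missing_h)  # 72 -> thin slice of top/bottom lost (or slight squash)
```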

Link to comment
2 hours ago, misterBee said:

Based on the video you uploaded to Twitter, that's not 1920x1008 either.

 

1920x1008 means you're missing 72 vertical pixels.  You'll lose some of the top/bottom of the image or get some vertical squashing, but you should still have the full width.  Should still be perfectly watchable.

 

In your Twitter video you're missing huge parts of the image, both horizontal AND vertical.  It's not just capturing at the wrong resolution -- it looks like it cropped out most of the picture.

 

I don't have much experience with Game Bar, but OBS is pretty much the standard for doing screen capture these days.  Going forward I'd just use that -- it's way more configurable and should work much better.  You're probably already using it to stream, so you might as well just do offline recording with it too.

That's what I normally do, but I had a brain fart and captured the game window, which is inconsistent, instead of just capturing the full screen.

 

Oh well.

Link to comment
On 9/16/2022 at 5:04 PM, Hawkingbird said:

I bought an EVGA 3080 in July. Fucking hell

Probably for the best. Despite video cards being 90% of their business, graphics cards are a minor portion of their profits. EVGA makes so little on each card.

Apparently their big money makers are motherboards and power supplies.

 

 

Link to comment
1 hour ago, DarkSakul said:

Probably for the best. Despite video cards being 90% of their business, graphics cards are a minor portion of their profits. EVGA makes so little on each card.

Apparently their big money makers are motherboards and power supplies.

 

 

20% of their revenue is power supplies; the remainder is the rest of their product stack. My last two power supplies have been from them, so they're top tier. They started producing keyboards a year or two ago. I question how good those are, as I always see them heavily discounted.

Link to comment
1 hour ago, Hawkingbird said:

20% of their revenue is power supplies; the remainder is the rest of their product stack. My last two power supplies have been from them, so they're top tier. They started producing keyboards a year or two ago. I question how good those are, as I always see them heavily discounted.

Word is their profit margins on graphics cards are slim, thanks to Nvidia, and Nvidia undercutting them by selling Founders Edition cards.

Worse is that EVGA and other board partners get drivers at about the same time YouTube reviewers do, while the board makers have to spend months designing the cards.
Nvidia leaves their partners in the dark on so much, and apparently EVGA had enough.

Link to comment

It's actually worse than I thought. I saw a commenter saying the 4080 16GB is a rebranded 4070 and the 4080 12GB is a rebranded 4060 Ti. If you check the percentage of CUDA cores for a 3070 vs a 3090 and a 3060 Ti vs a 3090, then compare them to the 4080 12GB and 4080 16GB vs the 4090, the dude isn't BSing. WTF.

 

4090 - 16384 Cuda Cores

4080 (16GB) - 9728 Cuda Cores (59.38%)

4080 (12GB) - 7680 Cuda Cores (46.88%)

 

3090 - 10496 Cuda Cores

3070 - 5888 Cuda Cores (56.1%)

3060 Ti - 4864 Cuda Cores (46.34%)
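Those ratios are easy to recompute. A quick sketch using the core counts quoted above (each cut-down card's cores as a percentage of its generation's flagship):

```python
# Recompute the CUDA-core ratios quoted above. The 12GB 4080 sits at the
# same fraction of the flagship as the 3060 Ti did last generation.
cards = {
    "4080 16GB / 4090": (9728, 16384),
    "4080 12GB / 4090": (7680, 16384),
    "3070 / 3090":      (5888, 10496),
    "3060 Ti / 3090":   (4864, 10496),
}
for name, (cores, flagship) in cards.items():
    print(f"{name}: {cores / flagship:.2%}")
# 4080 16GB / 4090: 59.38%
# 4080 12GB / 4090: 46.88%
# 3070 / 3090: 56.10%
# 3060 Ti / 3090: 46.34%
```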

Link to comment
2 minutes ago, Hawkingbird said:

These cards ain't even out yet and it might end up being a bust like the 20 series if all of this is accurate 

Honestly, based on previous history, I have zero confidence in the consumer. Even when AMD had better cards than Nvidia, Nvidia handily outsold them. I hope I'm proven wrong and people finally hold them accountable for this shit. This really lends a lot of credence to what EVGA was saying.

Link to comment
2 minutes ago, Darc_Requiem said:

Honestly, based on previous history, I have zero confidence in the consumer. Even when AMD had better cards than Nvidia, Nvidia handily outsold them. I hope I'm proven wrong and people finally hold them accountable for this shit. This really lends a lot of credence to what EVGA was saying.

AMD GPUs have a long history of having shitty drivers. They haven't done much to fix that reputation. RDNA 3 might be the way to go at this rate.

Link to comment
2 minutes ago, Hawkingbird said:

AMD GPUs have a long history of having shitty drivers. They haven't done much to fix that reputation. RDNA 3 might be the way to go at this rate.

I guess I've just been lucky in that regard. I haven't had any major driver issues on my AMD cards in decades. I'd have to go back to my old Sapphire ATI 9600, and even those issues were minor and comparable to my issues with the GTX 880M on my previous laptop. When I think of shitty drivers, I think of the issues Intel is having with Arc.

Link to comment
34 minutes ago, Darc_Requiem said:

I guess I've just been lucky in that regard. I haven't had any major driver issues on my AMD cards in decades. I'd have to go back to my old Sapphire ATI 9600, and even those issues were minor and comparable to my issues with the GTX 880M on my previous laptop. When I think of shitty drivers, I think of the issues Intel is having with Arc.

I've never had an AMD card so I can't speak to their drivers personally. My nephew is rocking an RX 580 in his PC and there's been no issues as far as I know.

Link to comment
1 hour ago, Hawkingbird said:

I've never had an AMD card so I can't speak to their drivers personally. My nephew is rocking an RX 580 in his PC and there's been no issues as far as I know.

I've had basically an even split. The only non-AMD/ATI or Nvidia card I had was my first video card. Let me see if I can remember all my cards.

 

Matrox G200

Nvidia Geforce Ti 4600 (died and fried my whole system 😑)

ATI 9600

Nvidia Geforce 9700M

ATI 5870M

Nvidia Geforce GTX 880M

Nvidia Geforce GTX 1080

AMD 5700XT

AMD 6900XT

 

Yeah, it's an exact even split if you toss out the Matrox: four of each brand. The only bad card I had was the Ti 4600. Other than that, I've been fortunate on the GPU front.

Link to comment
37 minutes ago, Darc_Requiem said:

I've had basically an even split. The only non-AMD/ATI or Nvidia card I had was my first video card. Let me see if I can remember all my cards.

 

Matrox G200

Nvidia Geforce Ti 4600 (died and fried my whole system 😑)

ATI 9600

Nvidia Geforce 9700M

ATI 5870M

Nvidia Geforce GTX 880M

Nvidia Geforce GTX 1080

AMD 5700XT

AMD 6900XT

 

Yeah, it's an exact even split if you toss out the Matrox: four of each brand. The only bad card I had was the Ti 4600. Other than that, I've been fortunate on the GPU front.

My GPU history doesn't run that deep. All my GPUs have been Nvidia. My very first card was gifted to me by a friend after he upgraded his system. My cards were

 

8800 GT

GTX 680

GTX 1080 ti

RTX 3080

 

I might make my next one an AMD if Nvidia continues with the bullshit

Link to comment
5 hours ago, Darc_Requiem said:

I guess I've just been lucky in that regard. I haven't had any major driver issues on my AMD cards in decades. I'd have to go back to my old Sapphire ATI 9600, and even those issues were minor and comparable to my issues with the GTX 880M on my previous laptop. When I think of shitty drivers, I think of the issues Intel is having with Arc.

In the past I had more issues with Nvidia drivers than ATI drivers.

ATI even allowed third-party drivers back before AMD took over.

Not to mention Nvidia drivers are terrible on Linux distros.

Link to comment
7 hours ago, DarkSakul said:

In the past I had more issues with Nvidia drivers than ATI drivers.

ATI even allowed third-party drivers back before AMD took over.

Not to mention Nvidia drivers are terrible on Linux distros.

Yeah, my GTX 880M was a pain in the ass. I'd have to roll back to the previous drivers on that thing often. Thinking back on it, it was worse than my 9600.

Link to comment

I have noticed some minor annoyances since upgrading my GTX1080 to a 6700XT.

 

There were some in-game stutters that were a pain to fix, and aggressive power-saving/GPU down-clocking makes it so that dragging UI windows around when out of game can occasionally be sluggish.  Some driver features also seem a bit buggy.  Freesync can be hit or miss and turning on integer scaling messes up framerates pretty badly for some reason.  AMD drivers appear to support all the same features as Nvidia,  but when you try to actually use them they don't always work as well.

 

There's no doubt the overall user experience is much more polished with Nvidia.  I don't really regret switching though.  The price/performance ratio and raw rasterization power are good enough for me at the moment.

Link to comment
2 hours ago, Hawkingbird said:

Linus is the only reviewer I've seen so far that had anything negative to say about the new chips. Mainly that they can be outperformed by the 5000 series 3D CPUs.

Honestly, I don't watch a lot of Linus. He comes off as pretty surface level and has pretty meh takes at times. In contrast, the two Steves (HUB and GN) are incredibly thorough. Steve at Hardware Unboxed did a detailed cost-per-frame analysis. That was going above and beyond. He didn't just do it based on CPU cost; he did CPU cost, CPU + RAM cost, and CPU + mobo + RAM cost. It really gives people a picture of the pros and cons no matter what situation they're in. It boggles my mind that Hardware Unboxed hasn't hit 1 million subs yet.

 

From what I've seen, if someone wanted to get a 7600X, they'd be better off waiting for B series motherboards. It's a good performer, basically giving 12900K-level gaming performance with 6 cores. The issue is that X670 motherboards are ludicrously expensive. DDR5, while more expensive than DDR4, has come down significantly in cost. Given how quickly DDR5's price is dropping, it'll probably be at DDR4's level of cost by the first quarter of next year.

 

Anybody looking at the 7950X is probably a halo shopper who, hopefully, isn't just gaming. The productivity performance of the 7950X is insane. The gaming performance is more or less the same as the 7600X because games aren't leveraging all 16 cores. That said, the 5950X's efficiency is impressive; both the 12900K and 7950X are pulling 240+ watts. EVGA was wise to focus on power supplies. We've got CPUs pulling GPU levels of power. I can't imagine how bad the 13th gen Intel CPUs will be. They are increasing the P core clocks and added E cores. The 13900K is probably going to be a 300+ watt CPU.
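A cost-per-frame comparison like the one described above can be sketched as follows. Note this is a hypothetical helper with made-up placeholder prices and FPS figures, not Hardware Unboxed's actual data:

```python
# Hypothetical cost-per-frame helper (illustration only; all numbers below
# are made-up placeholders, not Hardware Unboxed's figures). The point:
# total platform cost, not just CPU price, drives the real value comparison.
def cost_per_frame(avg_fps, cpu, mobo=0, ram=0):
    """Dollars of platform cost per frame of average FPS."""
    return (cpu + mobo + ram) / avg_fps

# Same hypothetical CPU, three views of its cost:
print(cost_per_frame(200, cpu=300))                     # 1.5   (CPU only)
print(cost_per_frame(200, cpu=300, ram=150))            # 2.25  (CPU + RAM)
print(cost_per_frame(200, cpu=300, mobo=300, ram=150))  # 3.75  (CPU + mobo + RAM)
```

The same chip looks very different once an expensive motherboard is in the denominator's numerator, which is exactly why the multi-view analysis is useful.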

 

Edited by Darc_Requiem
Link to comment
2 minutes ago, Darc_Requiem said:

Honestly, I don't watch a lot of Linus. He comes off as pretty surface level and has pretty meh takes at times. In contrast, the two Steves (HUB and GN) are incredibly thorough. Steve at Hardware Unboxed did a detailed cost-per-frame analysis. That was going above and beyond. He didn't just do it based on CPU cost; he did CPU cost, CPU + RAM cost, and CPU + mobo + RAM cost. It really gives people a picture of the pros and cons no matter what situation they're in. It boggles my mind that Hardware Unboxed hasn't hit 1 million subs yet.

Linus knows his lane. He is perfectly capable of providing the type of coverage the two Steves do but chooses not to. Being the casual/gateway tech channel is his role and he's happy to fill it. I think GN's Steve has spoken about this, as some of their viewers started with Linus and discovered them when they desired more in-depth knowledge.

 

I still like watching Linus as he's entertaining and might catch something the others have missed. The others didn't test the 7000 series with the 3D chips like he did.

Link to comment
10 minutes ago, Darc_Requiem said:

Motherboard manufacturers are milking early adopters. They know the reasonably priced boards hit next month but they also know there is a segment of the market that "has to have it right now." 

Waiting for next month is what I'm doing. I will have to anyway as I want an ITX board and it seems there's only one that will be available at launch.

 

I see very little with these new boards to justify these prices. It's not like they get scalped to hell like a GPU or RAM.

Link to comment
6 minutes ago, Hawkingbird said:

Waiting for next month is what I'm doing. I will have to anyway as I want an ITX board and it seems there's only one that will be available at launch.

 

I see very little with these new boards to justify these prices. It's not like they get scalped to hell like a GPU or RAM.

There is no justification for it. Last gen boards, due to the inclusion of PCIe Gen 4, were legitimately more expensive to manufacture than the previous gen. Now they are just trying to see what they can get away with IMO.

Edited by Darc_Requiem
Link to comment
2 hours ago, Hawkingbird said:

Rare instance of Steve favoring a Ryzen 7 at launch

 

 

This is the AMD chip I have my eye on. All I need to do now is wait for the B650 boards. 

Yeah, it caught me off guard too. Based on how Zen 4 works, the 7700X's average clocks are higher than the 7900X or 7950X, which gives it amazing gaming performance. And since the budget AM5 boards aren't out yet, it doesn't make sense to buy a 7600X. So for purely gaming, the 7700X is the best option. Most people aren't going to put a $300 CPU into a $300+ motherboard. Things may change once B650 hits, if the prices are reasonable and the boards are solid of course.

Link to comment

 

 

Linus deserves a slap in the mouth for this title
 


Sure they compete with the Nvidia RTX 3060 and AMD RX 6600, but they have their issues:
1. Not compatible with all chipsets/boards
2. Not compatible with all Intel and AMD CPUs
(Cliff notes for 1 & 2: if your system doesn't support Resizable BAR, don't get this card)
3. Their legacy support is weird.

Edited by DarkSakul
Link to comment

@DarkSakul I watched the Hardware Unboxed review. I'll link it, but the summary was basically: if they were only competing with the 3060 ($370), given the cheaper price, it could be a viable option, and the cost savings, for some, could be worth the additional headache. However, the 6650 XT is cheaper and performs better on average. You can get a 6650 XT for $300, and the 6600 can be had for $240. There is a reason Intel's slides did a value comparison against the 3060. Hell, for an extra $20 you can get a 6700 XT instead of a 3060. Nvidia's low-end GPU prices are a joke. One of their biggest advantages at the high end is ray tracing performance; at the low end it's a non-issue because ray tracing performance just sucks across the board.

 

 

Link to comment
1 minute ago, Darc_Requiem said:

@DarkSakul I watched the Hardware Unboxed review. I'll link it, but the summary was basically: if they were only competing with the 3060 ($370), given the cheaper price, it could be a viable option, and the cost savings, for some, could be worth the additional headache. However, the 6650 XT is cheaper and performs better on average. You can get a 6650 XT for $300, and the 6600 can be had for $240. There is a reason Intel's slides did a value comparison against the 3060. Hell, for an extra $20 you can get a 6700 XT instead of a 3060. Nvidia's low-end GPU prices are a joke. One of their biggest advantages at the high end is ray tracing performance; at the low end it's a non-issue because ray tracing performance just sucks across the board.

 

 

I'm not disparaging Intel for going after the mid-tier market; that price bracket is a smart move to break in, and Nvidia needs to be humbled.
What I want is for people to be careful about being early adopters: you will be troubleshooting this card a lot. This is for the budget enthusiast who doesn't mind being a beta tester. It isn't for those whose only experience with gaming PCs is buying a pre-built and doing minimal or no upgrades; this isn't for the inexperienced.
Maybe if it was already installed in a pre-built, but it's going to be a while before we see pre-builts with Intel graphics cards in Best Buy, and maybe a systems builder service, but that's going to be a while too.

 

Link to comment

@DarkSakul What's hurt Intel is that they are launching late. IIRC, the original target date for Alchemist was this time last year. If this level of performance and pricing had dropped in 2021, they would have had some leeway. Now crypto has crashed, causing GPU pricing to tank. To compound their poor timing, Nvidia and AMD are about to drop their next gen cards. So instead of having the cheapest cards, with mid-tier performance and driver teething issues, they have cards that cost more than better-performing Radeon cards, with low-end performance and driver teething issues.

Link to comment
3 hours ago, DarkSakul said:

Linus deserves a slap in the mouth for this title

His argument for this is to keep Intel in the GPU game. Honestly he isn't the only one making this plea. Jay said the same thing though he did mention anyone buying it will be a beta tester. This thing needs to be in the hands of actual people if Arc is going to get any better. 

 

2 hours ago, Darc_Requiem said:

What's hurt Intel is that they are launching late. IIRC, the original target date for Alchemist was this time last year. If this level of performance and pricing had dropped in 2021, they would have had some leeway. Now crypto has crashed, causing GPU pricing to tank. To compound their poor timing, Nvidia and AMD are about to drop their next gen cards. So instead of having the cheapest cards, with mid-tier performance and driver teething issues, they have cards that cost more than better-performing Radeon cards, with low-end performance and driver teething issues.

Another thing that hurts Intel is the poor legacy support. For a GPU at this level of performance and price point, the biggest appeal would be playing old games that can run on toasters. Its poor performance in Siege, CS:GO, and anything that runs on DX9 and DX11 would kill it for the people that meet the ReBAR requirement.

Link to comment

Arc card teardown. Oh, and it's bad. Like Nvidia 2000 series bad.
Steve is about to lose his shit here.

 

4 hours ago, Hawkingbird said:

His argument for this is to keep Intel in the GPU game.

I get the argument, but it's based on bad faith. It's a bad beta test, expecting people to be savvy with a card that is picky about what hardware it works with and can't be user serviced.
Truth is, I can't in any sort of good faith recommend anyone buy this. It's better than people buying old Nvidia 970 and 770 cards, but not by much.

 

Here Steve even goes into how this is a beta test, made for enthusiasts who want to tinker and not for everyone.
They have it on price, but the dollar-to-quality value isn't there.
 

4 hours ago, Hawkingbird said:

Its poor performance in Siege, CS:GO, and anything that runs on DX9 and DX11 would kill it for the people that meet the ReBAR requirement.

Which hurts a mid-tier card like this; the mid-tier market tends to go for older games or budget games more than AAA titles.

I can't imagine what Intel was thinking with this. Then again I think their CPU coolers are all shit as well.

 

Edited by DarkSakul
Link to comment
12 minutes ago, DarkSakul said:

I can't imagine what Intel was thinking with this. Then again I think their CPU coolers are all shit as well.

It doesn't matter what Intel was thinking.  This was always gonna be the result for a first gen product.

 

Drivers are very very difficult to get right.  It was always going to be a shitshow at first, and it's clear they're not even going to bother with older titles as much.  Best to just focus on newer APIs and move forward rather than try to optimize decades worth of old games.

 

AMD, a company which supposedly knows how to make graphics cards, has had broken drivers for the past half year.  VIDEO HARDWARE ACCELERATION isn't currently working. That means in 2022 you can't even decode a Youtube video properly on an AMD card.  Shit is wild.

Link to comment
5 minutes ago, misterBee said:

It doesn't matter what Intel was thinking.  This was always gonna be the result for a first gen product.

 

Drivers are very very difficult to get right.  It was always going to be a shitshow at first, and it's clear they're not even going to bother with older titles as much.  Best to just focus on newer APIs and move forward rather than try to optimize decades worth of old games.

 

AMD, a company which supposedly knows how to make graphics cards, has had broken drivers for the past half year.  VIDEO HARDWARE ACCELERATION isn't currently working. That means in 2022 you can't even decode a Youtube video properly on an AMD card.  Shit is wild.

Intel Arc is going to go the way of Stadia unless Intel gets their head out of their butt.
It's one thing to have one thing that's shit, but the drivers are shit, the construction is shit (tape and glue, too many LEDs), the compatibility is shit, and this is a low-budget mid-tier card.

They need to find a board partner to step in and make cards for them, preferably with less tape and glue and fewer heat traps. This looks like Founders Edition trash.

Worst of all, Intel expects the public to jump on their card in order to keep Nvidia (and to a lesser extent AMD) honest?

Link to comment
11 minutes ago, DarkSakul said:

Intel Arc is going to go the way of Stadia unless Intel gets their head out of their butt.
It's one thing to have one thing that's shit, but the drivers are shit, the construction is shit (tape and glue, too many LEDs), the compatibility is shit, and this is a low-budget mid-tier card.

They need to find a board partner to step in and make cards for them, preferably with less tape and glue and fewer heat traps. This looks like Founders Edition trash.

Worst of all, Intel expects the public to jump on their card in order to keep Nvidia (and to a lesser extent AMD) honest?

 

Board partners ARE making Intel cards:

The A770 is a 3070-class card that is hamstrung by drivers so badly it's basically a 3060.  It's a decent start with a lot of problems.

 

I'm pretty sure Intel expects this first gen to do poorly, and they're placing all their bets on Battlemage, which is hopefully when the drivers and other stuff will have matured. We'll have to see what the B series of GPUs is like. If things haven't shaped up by then, it might be looking bleak.

Link to comment
8 hours ago, Hawkingbird said:

I don't think Arc will go the way of Stadia; most people want this to succeed because they want a third competitor in the market. Hopefully it is successful enough for Intel to stick around. Get it into OEMs if they have to

How many people are willing to take the risk is the question.

Link to comment
