Rapier Posted June 18, 2015
I have an Intel Core i5 notebook with an integrated Intel HD Graphics card and an NVIDIA GeForce GT 720M. It's onboard, if that's somehow relevant. For some reason, most games only notice my Intel HD Graphics, and I'm forced to use that instead of the other one. How can I make them recognize my NVIDIA card instead? There are games like Skyrim that do, and others like Dragon Age, Divinity: Original Sin and such that don't. I've tried going to the NVIDIA control panel to add these games manually, then choosing to always use it over the Intel HD Graphics, but for some reason it doesn't work.
Celice Posted June 18, 2015
You could probably do it at BIOS level and set your mobile GPU as the primary display adapter. The internet also has suggestions: http://superuser.com/questions/330568/how-can-i-determine-and-set-my-primary-graphics-card
Rapier Posted June 18, 2015
Setting it manually in the NVIDIA panel did not work, so I'll have to set it in the BIOS... I might try that, then. But I wish I could just switch when I run games, instead of always using the NVIDIA card.
Tessie Spoon Posted June 19, 2015
I've had similar issues, and one possibility I've come across is that your other games DO use your NVIDIA card, but they just think they're using your Intel card instead. That is, assuming you've set those games in the control panel to use your NVIDIA card. Which games is your card having trouble with? I might be able to help more with that info.
Rapier Posted July 7, 2015
Dragon Age, for one. Divinity: Original Sin also doesn't seem to recognize my graphics card, and my notebook heats up a lot when I set it to higher graphics settings, so I doubt they're recognizing my NVIDIA card while only reporting my Intel card. I've set both of them to run on the NVIDIA card in the panel, but it doesn't work. If I set my notebook to only use the NVIDIA card, will it hurt my battery, or will it just increase my energy bill at the end of the month?
Skynstein Posted July 10, 2015
> If I set my notebook to only use my NVIDIA, will it screw my battery or will it just increase my energy tax bill by the end of the month?

If the game is old, it could be something at the game level. The additional NVIDIA GPU consumes more energy, which is why there's that intelligent management mode: the Intel HD for general Windows use and the NVIDIA for resource-intensive tasks.
I'm gay dabadeedabada Posted July 12, 2015
Using the NVIDIA GPU all the time doesn't actually increase battery load that much, since most of the time it's not really doing anything anyway. Unless your laptop is already down to an hour of battery life, it wouldn't hurt to switch it to NVIDIA-only. I dunno about you, but I just set the preferred graphics card to my NVIDIA one instead of "Auto" and it worked fine with everything I use, so???
Rapier Posted July 15, 2015
Yes, so far:
1) I've set it as my main graphics card in the NVIDIA control panel
2) I've set the per-game option for specific games to use the NVIDIA graphics card
And it doesn't work for some games (Sonic Generations, Sonic Adventure 2 Battle, Divinity: Original Sin, Dragon Age and The Witcher 2 are the examples I remember right now). I've tried to set it as the main card in the BIOS, but I didn't see any option that remotely hinted at my graphics card.
Skynstein Posted July 16, 2015
> I've tried to set it to my main in the BIOS, but I didn't see any option that remotely hinted about my graphics card.

The BIOS he's referring to is probably the GPU BIOS, not the motherboard BIOS. You probably wouldn't be able to do anything at motherboard level anyway, since motherboard BIOSes in laptops are heavily locked down. Whenever modding a GPU (to overclock, for example), you work directly with the GPU BIOS. I've never really needed to access the GPU BIOS myself, actually. I once did that with an AMD GPU because its automatic fan control was faulty, but the BIOS looked fine, so I just kept using MSI Afterburner to add a custom fan profile and override the GPU's own control.
Rapier Posted July 17, 2015
I only know of one BIOS, the one where I press F2 at the start screen and mess around. Isn't that the one?
Skynstein Posted July 17, 2015
> I only know of one BIOS, the one I press F2 in the start screen and mess around. Isn't that the one?

http://www.tomshardware.com/faq/id-2288384/upgrade-gpu-bios.html
The startup one is the motherboard BIOS; that won't help, because you can't change anything about the GPU there. However, I think people at computer forums will help you better. :P
Rapier Posted July 17, 2015
Looked at your link, but it doesn't seem to have an upgrade for the GPU I have (NVIDIA GeForce 720M). The card model doesn't even show up in the Card Model bar, even when I set the manufacturer to "all" (or at least I think it doesn't; could be that computer geeks use a simplified term for my GPU model).