|
Keefer S wrote: If you know of information supporting your viewpoint, I'd love to read about it.
You read the link?
|
|
|
|
|
Interesting read, and it explains why Raxco's PerfectDisk uses a consolidate-free-space algorithm by default for SSDs.
|
|
|
|
|
ok...
Note, of course, that the post is nine years old, so maybe something has changed since then.
Additionally, it does not provide any references. The closest is the following:
"I dug deeper and talked to developers on the Windows storage team"
The first image is a screenshot. On my personal computer I can see that the service is not on, which suggests that, to some extent, even if Microsoft thinks it should be happening, it is not happening (on my computer).
The article says this:
"First, yes, your SSD will get intelligently defragmented once a month."
And it also says the following:
"Windows 7, along with 8 and 8.1 come with appropriate and intelligent defaults and you don't need to change them for optimal disk performance."
I did not change the defaults, and as noted, the optimization is not on. I am running Windows 10, so perhaps the article is no longer as relevant.
And at least back then, in 2014, SSDs had a reliability problem, so maybe that has changed since then too.
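For anyone who wants to check the same thing on their own machine without digging through the GUI, here is a minimal Python sketch (my own addition, not from the article) that queries the built-in scheduled defrag task; the task path below is the usual one on Windows 8/10, but treat it as an assumption for other versions.
```python
# Minimal sketch: query the built-in "ScheduledDefrag" task that the article's
# screenshot shows in the GUI. Assumes the standard task path on Windows 8/10.
import subprocess

def scheduled_defrag_status() -> str:
    """Return the raw schtasks output for the built-in defrag task."""
    result = subprocess.run(
        ["schtasks", "/Query", "/TN",
         r"\Microsoft\Windows\Defrag\ScheduledDefrag", "/FO", "LIST"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:
        return "query failed: " + result.stderr.strip()
    return result.stdout  # look for the Status line: Ready / Disabled / Running

if __name__ == "__main__":
    print(scheduled_defrag_status())
```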
|
|
|
|
|
Well, it was still around three years ago on Windows 10, since they released a bugfix for it.
Microsoft fixes Windows 10 bug causing excessive SSD defragging[^]
If you think about it, it makes sense to defrag SSDs too, just not very often.
If you get a lot of file fragments spread all over the disk, it causes excessive writes, since the files have to be spread across more blocks.
The default setting since Windows 8.1 is every 28 days.
|
|
|
|
|
I've only read bits and pieces of Scott's article, looking for specific keywords, but (I think) what he fails to mention is that Windows has adapted its defrag approach so it now knows how to tell an SSD apart from a spinner.
I believe there were justified concerns at the time: when SSDs first came out (if I remember my timeline correctly), XP's defragger just treated every drive like a spinner (the only thing it knew about) and blindly ran the defrag code that only made sense for traditional drives. It was only later that MS introduced the TRIM command to Windows.
|
|
|
|
|
The TRIM command was introduced with Windows 7, IIRC.
|
|
|
|
|
To my knowledge, this happens automatically anyway, in the form of TRIM, whereby unused blocks are cleaned up in some way.
In the Samsung FAQs, they also specifically recommend against any kind of defragmentation, as it causes additional writes, which in turn shorten the lifespan of the SSD.
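If you want to verify on your own machine that TRIM is actually enabled, here is a minimal Python sketch around the standard fsutil query (my own example, not from the Samsung FAQ); it is a rough string check, not a full parser.
```python
# Minimal sketch: check whether Windows reports TRIM (delete notifications)
# as enabled. "fsutil behavior query DisableDeleteNotify" is the standard
# command; a value of 0 means TRIM is on. Rough check only: it just looks
# for "= 0" in the output rather than parsing the NTFS/ReFS lines separately.
import subprocess
from typing import Optional

def trim_enabled() -> Optional[bool]:
    result = subprocess.run(
        ["fsutil", "behavior", "query", "DisableDeleteNotify"],
        capture_output=True, text=True,
    )
    if result.returncode != 0:      # may need an elevated prompt
        return None
    return "= 0" in result.stdout   # e.g. "NTFS DisableDeleteNotify = 0"

if __name__ == "__main__":
    status = trim_enabled()
    print("TRIM enabled" if status else "TRIM disabled or unknown (run elevated?)")
```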
|
|
|
|
|
Good question. Other comments favor NO, or just TRIM (which is something else, and was an issue a long time ago, when you couldn't yet set things up to run automatically).
Arguments on the hardware side are pretty convincing, but then still:
- hardware: why is reading/writing the same bytes in different-sized requests so much slower for small requests?
- what about the OS having to issue many more disk requests, switching from user mode? Can't that slow things down?
- and finally, what about measuring? (see the sketch after this post)
My impression is that it does make a difference. So, at most once a month, when I believe it is useful, I do a full defrag.
There is (in my case) an argument against it in differential backups (disk images): defrag costs many more backup bytes than no defrag, so much so that after a few backups a new complete backup becomes an option.
So I will lower my defrag frequency even further, to say once every three months.
Never say never!
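On the measuring question: here is a rough Python sketch (my own, with a made-up file name and sizes) that times reading the same data in small vs. large requests. It mostly shows the per-request overhead (system calls, user/kernel transitions) rather than fragmentation itself, but it gives a feel for why many small requests cost more than a few big ones.
```python
# Rough sketch: read the same file with small vs. large request sizes and time it.
# The file name and sizes are placeholders for illustration. Because the file
# ends up in the OS cache, the differences you see are mostly per-request
# overhead (syscalls, user/kernel transitions), not raw disk speed.
import os
import time

PATH = "testfile.bin"          # hypothetical test file
SIZE = 128 * 1024 * 1024       # 128 MiB

def make_test_file() -> None:
    with open(PATH, "wb") as f:
        f.write(os.urandom(SIZE))

def timed_read(block_size: int) -> float:
    start = time.perf_counter()
    with open(PATH, "rb", buffering=0) as f:   # unbuffered, so block_size matters
        while f.read(block_size):
            pass
    return time.perf_counter() - start

if __name__ == "__main__":
    make_test_file()
    for bs in (4 * 1024, 64 * 1024, 1024 * 1024):
        print(f"{bs // 1024:5d} KiB requests: {timed_read(bs):.3f} s")
    os.remove(PATH)
```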
|
|
|
|
|
...downhill!
VS consuming huge amounts of memory isn't new (even MS decided to ignore it totally)...
But now I have something new... and it has been confirmed several times...
I have a solution with around 80 projects in it, with only a few loaded at any given time... If I reload a project to change something, it will not compile until VS is closed and re-opened...
Until then it reports that compilation failed, without any actual error, but also without the option to run...
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
|
|
|
|
After the last update of VS2022, my colleague reported that debugging with step over and step into didn't work anymore. It was not clear to me whether he was talking about C++ or C# debugging; he also uses other debugging tools that might interfere with VS debugging.
|
|
|
|
|
RickZeeland wrote: debugging with step over and step into didn't work anymore. It was not clear to me if he was talking about C++ or C# debugging
Interesting you'd mention that. I installed the latest update last week, and on Thursday/Friday, on multiple occasions, single-stepping (F10) seemed to continue execution, or couldn't recover, or something like that. I attributed it to me fat-fingering it, but it happened enough times that, now that I see your post, I'm wondering if there's something to it.
In my case that would be C#.
|
|
|
|
|
Wow. Not testing much, are you, Microsoft?
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
|
|
|
|
One has to remember that multi-project solutions don't (always) compile if you haven't checked the proper project(s) in "Build | Configuration Manager", unless you specifically ask to "Build / Rebuild" that project. (Been there.)
On the other hand, when VS is "sleeping", it "seems" to release (more) excess memory. I think they're doing a lot of tinkering.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
I have a very precise dependency tree, so compiling the main project will compile everything that is outdated - I also mostly do build-solution...
But the main issue is that there is no error behind the failure, and re-opening VS solves the problem - which indicates that VS no longer knows how to reload an unloaded project correctly... (which is fixed by re-opening VS and the solution)...
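One way to narrow down whether it is VS or the build itself would be to run the same build outside VS with MSBuild. Below is a minimal Python sketch of that idea, assuming msbuild is on PATH and using a placeholder solution name.
```python
# Sketch: run the same build outside Visual Studio to see whether the
# "compilation failed with no errors" state is a VS problem or a real build
# failure. Assumes msbuild is on PATH (e.g. a Developer Command Prompt);
# "MySolution.sln" is a placeholder name.
import subprocess

MSBUILD = "msbuild"
SOLUTION = "MySolution.sln"

result = subprocess.run(
    [MSBUILD, SOLUTION, "/t:Build", "/m", "/verbosity:minimal"],
    capture_output=True, text=True,
)
print(result.stdout)
print("exit code:", result.returncode)  # non-zero means the build genuinely failed
```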
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
|
|
|
|
Kornfeld Eliyahu Peter wrote: VS consuming huge amount of memory isn't new
Versus which IDE that uses very little?
Kornfeld Eliyahu Peter wrote: I have a solution with around 80 projects in it
To me that would be an organization problem. I would break it into different solutions, and if that were not possible, it would suggest a different sort of problem.
|
|
|
|
|
Wordle 897 3/6*
🟩🟨⬛⬛⬛
🟩🟨🟩⬛🟩
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 897 3/6
🟩🟩⬜⬜⬜
🟩🟩⬜🟩🟩
🟩🟩🟩🟩🟩
All green 💚.
|
|
|
|
|
Wordle 897 3/6
⬛⬛🟩⬛⬛
⬛🟨🟩⬛🟨
🟩🟩🟩🟩🟩
|
|
|
|
|
⬜⬜🟩⬜⬜
🟨⬜⬜⬜⬜
🟩🟩🟩🟩🟩
In a closed society where everybody's guilty, the only crime is getting caught. In a world of thieves, the only final sin is stupidity. - Hunter S Thompson - RIP
|
|
|
|
|
Wordle 897 4/6
⬜⬜⬜⬜⬜
⬜⬜⬜⬜⬜
🟨🟨⬜🟩🟨
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 897 3/6
🟩⬜🟨⬜⬜
🟩⬜⬜⬜🟨
🟩🟩🟩🟩🟩
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
Wordle 897 4/6
🟩🟩⬛⬛⬛
🟩🟩⬛⬛🟩
🟩🟩⬛⬛🟩
🟩🟩🟩🟩🟩
Ok, I have had my coffee, so you can all come out now!
|
|
|
|
|
Wordle 897 4/6
⬜⬜🟩🟨⬜
⬜⬜🟩⬜🟩
⬜⬜🟩⬜🟩
🟩🟩🟩🟩🟩
|
|
|
|
|
I recently bought a new Bluetooth device and it only uses Bluetooth LE. The Bluetooth adapter on my desktop is too old to connect.
So $16 later, I have an upgraded adapter that supports BT 5.0. My new device connects and works!
But now my older Bluetooth devices don't connect...
Clarification: they won't pair with the new adapter.
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
That's disturbing to hear.
|
|
|
|