Whole-Drive Fill

This test starts with a freshly-erased drive and fills it with 128kB sequential writes at queue depth 32, recording the write speed for each 1GB segment. This test is not representative of any ordinary client/consumer usage pattern, but it does allow us to observe transitions in the drive's behavior as it fills up. That in turn lets us estimate the size of any SLC write cache, and get a sense of how much performance remains on the rare occasions when real-world usage keeps writing data after filling the cache.
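
To make the methodology concrete, here is a minimal Python sketch of the fill-and-measure loop, under some assumptions: the device path is a placeholder, writes are issued one at a time (queue depth 1) rather than at the queue depth 32 an asynchronous engine such as fio (e.g. --rw=write --bs=128k --iodepth=32) would sustain, and running it will destroy all data on the target drive.

```python
import mmap
import os
import time

BLOCK = 128 * 1024        # 128kB sequential writes
SEGMENT = 1024 ** 3       # report throughput for each 1GB segment
DEVICE = "/dev/sdX"       # placeholder device path; all data on it will be lost

# O_DIRECT bypasses the page cache so we measure the drive, not system RAM.
fd = os.open(DEVICE, os.O_WRONLY | os.O_DIRECT)

# O_DIRECT requires an aligned buffer; an anonymous mmap is page-aligned.
buf = mmap.mmap(-1, BLOCK)
buf.write(os.urandom(BLOCK))

written = 0
seg_start = time.monotonic()
try:
    while True:
        n = os.write(fd, buf)
        if n < BLOCK:                     # short write: end of the device
            break
        written += n
        if written % SEGMENT == 0:
            now = time.monotonic()
            print(f"{written // SEGMENT:5d} GB  {SEGMENT / (now - seg_start) / 1e6:7.1f} MB/s")
            seg_start = now
except OSError:
    pass                                  # ENOSPC once the drive is completely full
finally:
    os.close(fd)
```

Plotting the per-segment throughput from a loop like this is what reveals the SLC cache transitions described below.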

The ATSB tests already showed that the TeamGroup L5 LITE 3D doesn't lose much performance when it is full, but actually plotting its performance through the process of filling it up is surprising. The sequential write throughput does drop slightly after about 5GB, but only by 10-15MB/s, and there are no further performance drops for the rest of the fill process. This is far more consistent than most drives, and provides further evidence that running out of SLC cache isn't a problem for this SSD.

[Charts: Sustained 128kB Sequential Write (Power Efficiency); Average Throughput for Last 16 GB; Overall Average Throughput]

Working Set Size

When DRAMless SSDs are under consideration, it can be instructive to look at how performance is affected by working set size: how large a portion of the drive is being touched by the test. Drives with full-sized DRAM caches are typically able to maintain about the same random read performance whether reading from a narrow slice of the drive or reading from the whole thing. DRAMless SSDs often show a clear dropoff when the working set size grows too large for the mapping information to be kept in the controller's small on-chip buffers.
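
As a rough sketch of how such a test can be constructed, the Python snippet below issues 4kB random reads confined to the first N bytes of the drive and reports IOPS; sweeping N from a few GB up to the full capacity produces the kind of curve discussed here. It is a simplified queue-depth-1 version, and the device path, test duration, and working set sizes are placeholder assumptions.

```python
import mmap
import os
import random
import time

BLOCK = 4096               # 4kB random reads
DEVICE = "/dev/sdX"        # placeholder device path (opened read-only)
DURATION = 30              # seconds per working set size; an arbitrary choice

def random_read_iops(working_set_bytes):
    """Measure QD1 random read IOPS over the first `working_set_bytes` of the device."""
    fd = os.open(DEVICE, os.O_RDONLY | os.O_DIRECT)
    buf = mmap.mmap(-1, BLOCK)             # page-aligned buffer for O_DIRECT
    blocks = working_set_bytes // BLOCK
    ios = 0
    start = time.monotonic()
    deadline = start + DURATION
    while time.monotonic() < deadline:
        offset = random.randrange(blocks) * BLOCK
        os.preadv(fd, [buf], offset)       # positioned read into the aligned buffer
        ios += 1
    os.close(fd)
    return ios / (time.monotonic() - start)

# Sweep the working set size from 1GB up to 64GB (adjust to the drive's capacity).
for gib in (1, 2, 4, 8, 16, 32, 64):
    print(f"{gib:3d} GB working set: {random_read_iops(gib * 1024**3):8.0f} IOPS")
```

A drive with a full DRAM-backed mapping table should produce a nearly flat line across this sweep, while a DRAMless design typically drops off once the working set outgrows the controller's on-chip buffers.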

As expected, the L5 LITE 3D maintains fairly steady random read performance regardless of working set size. The DRAMless Mushkin Source starts off with significantly lower random read IOPS and declines even more as working set sizes grow to more than a few GB of active data. The three drives here with Phison controllers (one SATA, two NVMe) all show at least some decline in performance with large working set sizes, even though those drives all have the usual 1GB DRAM to 1TB NAND ratio.

42 Comments

  • jabber - Saturday, September 21, 2019 - link

    Always be wary of 1 Star tech reviews on Amazon. 60% of them are usually disgruntled "Doesn't work on Mac!" reviews.
  • flyingpants265 - Saturday, September 21, 2019 - link

    "They're a biased sample, as very happy and very unhappy people tend to self-report the most. Which doesn't mean what you state is untrue, but it's not something we can corroborate."

    Ryan, that doesn't explain why one model/brand can have 27% 1-star reviews, and another has 7%.... Unless you think Team Group customers are SEVERAL TIMES MORE outspoken than Crucial/Samsung/whatever customers for some reason. You can't ignore those reports. Ofc the product doesn't have a 27% failure rate, but it's likely much higher than competing products.
  • Korguz - Saturday, September 21, 2019 - link

    ever consider that maybe the bad reviews are either fake, or made-up reviews from people who don't actually own, or never even bought, the product?
  • flyingpants265 - Monday, September 23, 2019 - link

    ...Then you'd have to explain why ONLY TEAM GROUP SSDs have tons of fake 1-star reviews, and other SSDs don't. Seems AnandTech commenters are not that bright...
  • Korguz - Sunday, September 29, 2019 - link

    maybe one person who bought one, had it fail, and then, to try to get even, created more than one account? unless you can PROVE these supposed 1-star reviews are real reviews, then I guess you are not that bright as well... can't really prove your point, so you resort to insults... grow up
  • Kraszmyl - Friday, September 20, 2019 - link

    I have nearly a thousand of their drives from 128GB to 480GB and so far no failures. Also, yes, cheap products have poor support; that's one of the reasons why they are cheaper.
  • Samus - Saturday, September 21, 2019 - link

    I'd err on the side of caution when dealing with Team Group, though. Failed memory (which I've seen plenty of over the years) is one thing, but failed data storage is a lot more catastrophic. I can't believe I'm saying this, but I'd feel safer with an ADATA SSD than a Team Group SSD... and I've seen a number of ADATAs fail, though none recently (in the last few years).
  • bananaforscale - Saturday, September 21, 2019 - link

    It's your responsibility to make backups. Never allow a single point of failure.
  • philehidiot - Saturday, September 21, 2019 - link

    Aye, especially with SSDs where data recovery is harder than with a HDD. Personally, I pop critical data on two local SSDs and then a memory stick, phone or other system. I don't like the cloud as it is at the mercy of the Internet or another company and I've had access issues which have denied me access to data or, weirdly, only given me access to months old versions. So I prefer a dual local backup, so if a drive fails I can just switch to the backup immediately, and also another copy which is not linked to the same system in case of some catastrophic PSU weirdness that takes out other components (happened once a long time ago and with a cheap PSU) or malware attacks. If I was getting a cheap SSD with a reduced warranty, knowing it uses whatever NAND is cheap at the time, I'd not be using that in a critical system without adequate redundancy (RAID, most likely). You pays your money and takes your choice but if you buy cheap, ensure you're protected... And if you buy expensive.... Ditto.
  • evernessince - Saturday, September 21, 2019 - link

    Technically speaking, the failure rates should be no higher than other manufacturers', after all they are using the same NAND and controllers as everyone else. That said, there is something to be said for poor customer service. I don't know how they are getting that many 1-star reviews though, not unless they are just rebranding B stock.

    Also, you shouldn't trust only one source for reviews and you should always look at who is posting the bad reviews. For example, this guy seems to be the exact same guy who posted on Newegg as well

    https://www.amazon.com/gp/profile/amzn1.account.AF...

    He seems to leave a lot of bad reviews and oftentimes does not do a good job explaining why. Judging by their English usage, I'd also say they are not a native speaker. There are plenty of companies in China that pay people to go out and write both good and bad reviews for competing products, which makes research on reviewers all the more important.

    I'm not saying they don't deserve their rating, I'm just saying you should always check not just the reviews but the reviewers as well. 2 sources minimum as well. It's a PITA but there are so many fake reviews out there (especially on Amazon) that it's required if you want to get what you paid for.
