As an engineer who’s been given all sorts of beans to set up, with limited time to do it, I don’t think it’s going to help.

However, maybe if I were to take some away and run a more in-depth trial, it would hit the spot?

Then, when given that particular setup again, I’d have a good start?

I can see the usefulness for roasters and users who keep the same beans for a while, but it doesn’t seem a one-hit wonder.

Could you, from the first test results, get it ‘right’ on the second, or at least very close?


    NewBoyUK When you say ‘set up’, do you mean adjust grind for a known brew method/ratio/system (e.g. 1:2 or other known/typical ratio espresso shots, or filter brews)?

    You would be able to nominally dial in to a target %EY (maybe 19% ±2% for espresso, or 20% ±2% for filter) within a handful of grind adjustments. This would not, in itself, guarantee a great-tasting result, but it would show there is no obvious objective, mechanical impediment to the owner/user then fine-tuning to their preference point.

    The target range is typically 18–22% EY, but different roasts and origins extract more or less than each other: Brazils, Costa Ricans & Guatemalans may taste best at the lower end; Kenyans, Rwandans & Colombians at the higher. You can still get under-extracted coffee at 19% and over-extracted at 21%, in terms of sensory perception, depending on the bean. A 20% extraction, for example, would be an average target over a sample, not the best target in a ‘one size fits all’ scenario.
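    For anyone wanting to sanity-check the arithmetic: %EY for a filter brew is commonly computed as beverage weight × TDS ÷ dose. A minimal sketch of that calculation and the dial-in logic described above (Python, my own illustration with made-up example numbers, not anything from a vendor):

    ```python
    # Minimal sketch of the %EY arithmetic discussed above. The formula
    # (EY% = beverage_g * TDS% / dose_g) is the common approximation for
    # filter brews; names and example numbers are illustrative assumptions.

    def extraction_yield(beverage_g: float, tds_percent: float, dose_g: float) -> float:
        """Percent of the dry dose that ended up dissolved in the cup."""
        return beverage_g * tds_percent / dose_g

    def grind_hint(ey: float, target: float = 20.0, tol: float = 2.0) -> str:
        """Crude dial-in hint: low EY suggests a finer grind, high EY coarser."""
        if ey < target - tol:
            return "under target: try a finer grind"
        if ey > target + tol:
            return "over target: try a coarser grind"
        return "in the target band: fine-tune by taste"

    # Example: 30 g dose, 480 g of brewed filter coffee at 1.38% TDS.
    ey = extraction_yield(beverage_g=480, tds_percent=1.38, dose_g=30)
    print(f"EY = {ey:.1f}% -> {grind_hint(ey)}")  # EY = 22.1% -> over target
    ```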

    I don’t really see the use for roasters specifically; they tend to brew/QC by cupping & rarely measure the dose (if following SCA protocol), nor the water weight, frequently under-extracting coffee. You can under-develop roasts (in the sensory aspect) and still achieve an expected EY. To roast coffee so badly that it doesn’t extract normally in a filter brew would show a catastrophic failure & be pretty tragic (nevertheless, it occasionally happens).

    In short, if the machine & grinder are known good models without any obvious/detectable malfunctions, and the coffee is ball-park roasted for the brew method (e.g. very light/filter roasts may need to be brewed at longer espresso ratios to extract satisfactorily), you’re probably not going to see any evidence of a problem that isn’t already presenting a clue elsewhere, like way too coarse/fine a grind, someone trying bizarrely short espresso ratios, etc.

    If you always took a bean with you that was consistently available and you were used to it, you could use this to dial in and say, ‘I tested with my usual beans, dialled in & all is working well.’ If the customer says that they don’t like their beans with the setup, you could tell in a brew whether there was an obvious extraction-related reason why; if not, then tell them to buy different beans :-)

    Beans are an ingredient; we buy different ones to explore different tastes, and sometimes we just don’t like them when brewed normally. You can’t fix every bag by tweaking extraction. You are dealing with objective mechanical devices, and you can use a refractometer to check their objective function. You can’t make everything taste great, in the same way you can’t fix someone’s CD player to make Tom Waits sound like Elvis.

    Basically, I get any old beans thrown at me, from £8/kg to £30/500g.

    I deal with 2 different machines and generally 4 different grinders. It’s what bean gets chucked in that’s the hard part.

    I have to do roughly 1:3 to 1:4 for unknown (cheap) beans. If they are actually labelled and have contact details on the bags (the more common blends), I usually call them, and generally they suggest 1:2 to 1:3.

    Some are awful and get spat out lol

    Just thinking/hoping there’s an easier way to get the best out of what’s given.


      NewBoyUK It’s what bean gets chucked in that’s the hard part.

      Subjectively, in terms of expectation (which may not be realistic), I can see this being hard. Objectively, not so much, if the output is achieved in normal time & EY.

      NewBoyUK I have to do roughly 1:3 to 1:4 for unknown (cheap) beans. If they are actually labelled and have contact details on the bags (the more common blends), I usually call them, and generally they suggest 1:2 to 1:3.

      I don’t see why cheap beans would need a longer ratio, unless it is just to mute offensive flavours/intensity; cheap, overly roasted beans tend to extract easily enough. Yes, many roasters suggest 1:2 to 1:3 ratios, but the bean can’t do it all by itself; some of it depends on whether the equipment will allow those beans to extract at those ratios. This you will be able to establish with repeated use of the 4 grinders.
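      To illustrate the ratio point with rough numbers: with ratio taken as beverage weight ÷ dose, EY% ≈ ratio × TDS%, so the TDS needed to reach a given extraction falls as the ratio lengthens. A quick sketch (my own illustration, not data from this thread):

      ```python
      # Rough illustration: the TDS needed to hit a given EY falls as the
      # brew ratio (beverage weight / dose) lengthens, which is why a
      # slow-extracting bean can still reach target at 1:3 or 1:4.

      TARGET_EY = 20.0  # example target from the discussion above

      for ratio in (2.0, 3.0, 4.0):
          needed_tds = TARGET_EY / ratio
          print(f"1:{ratio:.0f} needs ~{needed_tds:.1f}% TDS for {TARGET_EY:.0f}% EY")
      # 1:2 needs ~10.0%, 1:3 ~6.7%, 1:4 ~5.0% TDS
      ```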

      You’re an engineer for the machines & grinders, not a fairy godmother for the roasting community. :-)

      MWJB The DiFluid is absurdly cheap for the spec it quotes, but I’ve not tried it and already have too many refractometers that I don’t/can’t use.

      I wouldn’t say “I lied”, but blame unrestrained curiosity, late-night wine drinking & ‘one click’ online purchasing… A DiFluid R2 arrived today. I’ll post some thoughts over the coming week. Immediately, I don’t like the spoon idea - 3 ml pipettes recommended.

      I await your 👍👎

      I believe you can share results with people, so I may ask for a few… if you don’t mind.

        NewBoyUK I’ll post my findings here; they will be for filter brews. If satisfactory for that, espresso won’t be an issue, assuming all samples are syringe filtered.


          MWJB I’ll post my findings here; they will be for filter brews.

          Awesome mate! We’re looking forward to it. :-)


            LMSC I’ve done 3 sets of 10 readings (V60 paper-filtered drip, no syringe filtering of samples) with the VST LAB II and the DiFluid R2. Both were zeroed at the same time & readings taken on one device after the other in an equivalent timeframe. I followed the VST protocol for the VST and the DiFluid protocol from their videos (the manual is not specific in this regard).

            Test 1:

            DiFluid average TDS after 10 reads 1.41, stdev. 0.007 (0.016 at 95% confidence).

            VST Lab II average TDS after 10 reads 1.40, stdev. 0.006 (0.013 at 95% confidence).

            Test 2:

            DiFluid average TDS after 10 reads 1.42, stdev. 0.007 (0.016 at 95% confidence).

            VST Lab II average TDS after 10 reads 1.38, stdev. 0.004 (0.010 at 95% confidence).

            So far, so good: the difference in span between readings didn’t exceed 0.06% TDS, and as both devices claim ±0.03% accuracy, this seems fine…

            Test 3, and the wheels seem to come off a little…

            DiFluid average TDS after 10 reads 1.48, stdev. 0.008 (0.018 at 95% confidence).

            VST Lab II average TDS after 10 reads 1.40, stdev. 0.007 (0.016 at 95% confidence).

            So the variation in readings between devices is not a constant interval. As both are quoted as accurate to ±0.03% TDS, something seems adrift here, as the difference in readings was up to 0.09% TDS. (Neither I, nor any influencers that I am aware of, have the facility to check refractometer accuracy.) However, the precision of the readings from both seems acceptable.
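            For what it’s worth, the quoted ‘(x at 95% confidence)’ figures look consistent with sample stdev × the two-tailed Student’s t value for 10 readings (t ≈ 2.262 at 9 degrees of freedom). A sketch of that reading of the numbers, plus the device-difference check (my interpretation, not a method stated in the post):

            ```python
            # Sketch of how the "(x at 95% confidence)" figures above appear
            # to be derived: sample stdev * two-tailed Student's t for n = 10
            # readings (t(0.975, df=9) ~= 2.262). My interpretation only.
            from statistics import stdev

            T95_DF9 = 2.262

            def band_95(readings: list[float]) -> float:
                """95% band for a single reading, from a sample of readings."""
                return T95_DF9 * stdev(readings)

            # Reproducing Test 1's DiFluid figure: 2.262 * 0.007 ~= 0.016
            print(f"{T95_DF9 * 0.007:.3f}")  # 0.016

            # Two devices each claiming +/-0.03% TDS accuracy may legitimately
            # disagree by up to 0.06% TDS; Test 3's 0.08% gap in averages
            # (1.48 vs 1.40) exceeds that.
            print(abs(1.48 - 1.40) <= 0.06)  # False
            ```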

            I then wondered whether the difference in protocol was causing the differences in the averages, so I repeated Test 3 using the VST protocol for both devices: rather than plonking the hot sample onto the DiFluid lens and taking readings, I cooled the sample in an espresso cup, then placed it on the lens.

            Test #3 repeated with VST protocol for both devices:

            DiFluid average TDS after 10 reads 1.44, stdev. 0.031 (0.069 at 95% confidence).

            VST Lab II average TDS after 10 reads 1.42, stdev. 0.005 (0.012 at 95% confidence).

            The DiFluid readings started at 1.50% TDS, dropping to 1.41; the VST only drifted between 1.41 and 1.42. But the DiFluid settled to 1.41, 1.41, 1.41 for readings 8, 9 & 10 (vs 1.41, 1.41 & 1.41 for the Lab II).

            So I feel I have to start again. It seems logical that the plastic casing of the DiFluid may not be allowing samples to reach a steady state as quickly as the steel Peltier dish on the VST.
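            That settling behaviour suggests a simple stopping rule: keep reading until the last few readings agree within the display resolution. A sketch (my own heuristic with an illustrative reading sequence, not either vendor’s procedure):

            ```python
            # "Wait for steady state" stopping rule, motivated by the DiFluid
            # run above (1.50 drifting down to a stable 1.41). Window size,
            # tolerance and the reading sequence are illustrative assumptions.

            def settled(readings: list[float], window: int = 3, tol: float = 0.01) -> bool:
                """True once the last `window` readings agree within `tol` %TDS."""
                if len(readings) < window:
                    return False
                tail = readings[-window:]
                return max(tail) - min(tail) <= tol

            run = [1.50, 1.47, 1.45, 1.44, 1.43, 1.42, 1.42, 1.41, 1.41, 1.41]
            for i in range(1, len(run) + 1):
                if settled(run[:i]):
                    print(f"steady after {i} readings at {run[i-1]:.2f}% TDS")
                    break
            # prints: steady after 7 readings at 1.42% TDS
            ```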

            The DiFluid spoon (volume 0.65g) is a bit daft/messy. Printed instructions are scant, and the YT videos don’t offer any further info on taking readings.

            I’m a bit disappointed that I’m having to explore a previously unmentioned test protocol, seeing that this has been in the field for some months, but I’ll see what occurs over the next few days.

            Data is here:

            VST Lab II vs DiFluid R2

            So for the DiFluid R2, I use the provided spoon to take the sample for filter, then put it on a tablespoon to let it cool down. The R2 definitely struggles to cool down the sample compared to the VST, and the temperature difference between zero and sample has a big effect on TDS (data from Jonathan Gagné), with room temperature being 75°F in his graph.

            Following the Pocket Science workflow gives the most consistent results, and it is still usable with the VST to make sure the temperature is stable (https://pocketsciencecoffee.com/2022/12/07/my-current-refractometry-workflow/).



              InfamousTuba If this is DiFluid’s test protocol, they should say so somewhere?

              You’ll see from my tests there was no issue with the VST’s precision using their protocol. I’m not sure why Gagné would be measuring samples at such high temperatures?

              @MWJB It isn’t DiFluid’s protocol; they aren’t the best at providing good protocols and information. So it means using a different protocol, which is a bit annoying, instead of being able to use the same one.

              Gagné’s temperature starts at his room temperature of 75°F (23.9°C) and goes up to 90°F (32.2°C), which is warm but not that hot.



                InfamousTuba Sorry, the chart said “Sample temperature”, rather than ambient temperature.

                InfamousTuba It is sample (not room) temperature according to the article, but no one would do this in real life (after reading VST’s instructions).

                No one would use those instructions; they are from quite a few years ago. It does say room temperature was 75.7°F, just under the figure that I copied here.



                  InfamousTuba The instructions were written by the guy who innovated the method by which a refractometer is used to read coffee TDS, produced the first coffee refractometers & set the precision & accuracy specs. They still work.

                  @MWJB That is the technique I follow with the VST, but the DiFluid (and the Atago) don’t have the same temperature correction that the VST does; the VST can automatically account for temperatures between 15–40°C. That is the difference between a <£200 device and a £700 device: you do get some better features, and they might be worth it for some users.


                  Thx folks for the insight.

                  Three more tests now added, same protocol for both devices.

                  Stir the coffee sample and place a teaspoonful into an espresso cup/shot glass to cool. For all devices, this sample will need syringe filtering if it’s espresso, French press, cupping, Aeropress, or pour-over made with very fine grinds (i.e. any brew with significant suspended solids), or for absolute best results with all methods.

                  Calibrate with distilled water left on the lens for 1:00. Clean the lens with a lint-free cloth/science wipe.

                  Add the sample to the refractometer (3 drops for the VST, 6 drops for the DiFluid R2) and wait 1:00. Now would be a good time to start tasting your brew, if it’s cool enough.

                  Start readings; I took 10 from each.
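                  For anyone replicating this, the shared protocol reduces to a fixed, timed sequence; here it is as a trivial checklist script (steps and 1:00 holds are from the post above, the script itself is just a convenience illustration):

                  ```python
                  # The shared test protocol above as a timed checklist. Steps and
                  # timings are from the post; the script is only illustrative.
                  import time

                  STEPS = [
                      ("Stir sample; spoon into espresso cup/shot glass to cool "
                       "(syringe filter if espresso/immersion/fine grinds)", 0),
                      ("Zero with distilled water left on the lens", 60),
                      ("Clean lens with lint-free cloth/science wipe", 0),
                      ("Add sample (3 drops VST, 6 drops DiFluid R2), wait", 60),
                      ("Take readings (10 per device in these tests)", 0),
                  ]

                  for step, wait_s in STEPS:
                      print(step)
                      if wait_s:
                          time.sleep(wait_s)  # 1:00 holds from the protocol
                  ```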

                  The averaged difference in TDS between the devices was +0.03% TDS (higher with the DiFluid), which is reasonable as both claim ±0.03% TDS accuracy.

                  The stdevs in readings were:

                  DiFluid R2: 0.007, 0.010, 0.005 - 0.007 averaged and 0.016%TDS extrapolated to 95% confidence level.

                  VST Lab II: 0.007, 0.007, 0.007 - 0.007 averaged and 0.016%TDS extrapolated to 95% confidence level.

                  So, as far as this test can tell, they are equivalent with this protocol. In terms of workflow, the VST Lab is quicker to stabilise and reached the average reading in 3-5 reads (I wouldn’t be too concerned about whether the reading recorded was 0.01% TDS out). The Lab II I used has since been superseded by VST’s more accurate & precise Lab III.

                  This makes the DiFluid R2 quite a step up from the Atago & Amtast offerings in terms of performance. The workflow isn’t as smooth as the VST Lab series’, but I guess at the price difference that’s a fair trade-off. The Atago workflow is much slower due to the rolling display.