The Naked Scientists

Naked Science Forum › Profile of SeanB › Show Posts › Thanked Posts

Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - SeanB

1
General Science / Re: Is there a safer alternative to electroconvulsive therapy for erasing memory?
« on: 16/11/2017 17:10:03 »
As far as I know, memory is not stored at a particular location or in a specific set of neurons. It behaves more like a hologram: the memory is created by the linkages formed between neurons, kept alive by being refreshed, and new memories create additional pathways.

Thus I suspect that your method, while able to erase particular memories, would have the very unwanted side effect of also erasing or modifying the rest of the memories stored in that region, and so would prove less than ideal in practice. Deliberately thinking of a bad memory has side effects of its own: association will trigger additional recollections, ones you might well wish to keep, and the erasure would more than likely destroy or modify them as well.
The following users thanked this post: smart

2
Geology, Palaeontology & Archaeology / Re: Which rocks contain metallic flakes resembling silver and gold?
« on: 26/08/2017 09:19:12 »
A lot of granites do have quite high gold concentrations, but they are not at all economical to mine because they are so hard: the energy required to grind them to powder so the gold can be extracted costs more than the gold is worth. Most gold therefore comes from softer dolerites, which are easier to crush, and we do not mine seawater even though it has a comparable gold content, because there the hard part is concentrating the gold. That said, many of the flakes you see in rocks are various forms of mica, which can look very much like gold or silver.
The following users thanked this post: chris

3
General Science / Re: Why are builders using small bricks?
« on: 06/08/2017 05:04:36 »
A smaller brick is cheaper to manufacture. It needs to be fired for a shorter time to vitrify the clay all the way through (or at least deep enough to be usable), it packs better in the kiln, and fewer bricks are wasted after firing. Larger bricks need a much longer firing: the kiln runs at a fixed temperature, vitrification progresses inward from the surface at a roughly fixed rate, and the smaller brick has a larger surface-to-volume ratio, so the heat reaches a greater share of its volume in the same time.
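As a rough illustration of the surface-to-volume point, here is a quick comparison for two hypothetical brick sizes; the dimensions are only loosely based on a UK standard brick and a large block, not on any particular product:

```python
# Surface-to-volume comparison for two brick sizes.  Dimensions in mm
# are illustrative, not any particular standard.
def surface_to_volume(l, w, h):
    """Return the surface-area-to-volume ratio of a rectangular brick."""
    surface = 2 * (l * w + l * h + w * h)
    volume = l * w * h
    return surface / volume

small = surface_to_volume(215, 102.5, 65)   # roughly a UK standard brick
large = surface_to_volume(390, 190, 190)    # roughly a large block

print(f"small brick S/V: {small:.4f} per mm")   # ~0.0596
print(f"large block S/V: {large:.4f} per mm")   # ~0.0262
# The smaller brick has more than double the ratio, so heat penetrates
# a larger fraction of its volume in a given firing time.
```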

As bricks these days tend to be delivered palletised, rather than the old way of bulk bricks tipped off a truck, they are fired for a shorter time and are thus weaker. That is actually an advantage: it gives built-in crack stopping and expansion relief, with the brick failing before the mortar joints, which stops cracks from propagating all the way through the wall.

Century-old bricks were fired almost to full glass so they would survive handling, but as a result they are often somewhat distorted, and the mortar joint becomes the weak spot. Hard-fired brick does have a use in high-strength walls, but in an ordinary house it is not needed: the building code is designed around a massive safety factor on the much weaker brick allowed and the lowest permitted mortar strength. A standard double-wall brick construction is far stronger than any load the house will ever experience.

If, on the other hand, you build with hollow concrete block, you get a much weaker wall, simply because the block itself is a weak item and offers little surface area for the mortar joint between courses, so the wall has a much lower load rating despite each block being much larger.
The following users thanked this post: chris

4
Just Chat! / Re: Your aged care nurse?
« on: 01/07/2017 11:59:52 »
I want someone kind, caring and able to listen, with patience and the ability to hold a meaningful conversation no matter how long or how much I ramble on, and most importantly of all with no qualms about changing adult nappies or about me drooling and burping all over.

She works down the road at the frail care that looked after Mom when she was terminal.
The following users thanked this post: Karen W.

5
General Science / Re: What materials can eddy currents sort?
« on: 29/04/2017 10:11:51 »
You can use it to sort copper and aluminium out of a waste stream after the ferrous objects have been pulled out: as the rest of the stream follows a gravity fall, the eddy-current separator deflects these common non-ferrous metals into side chutes. You can then sort them further with another eddy-current stage, since copper and aluminium have different conductivities, and so raise the concentration of each metal in its stream.

After that you either smelt them again and accept the increase in impurity level, or do more refining to get a pure metal back. For copper that means electrolytic refining; for aluminium it is just resmelting in an inert atmosphere and skimming the impurities off the surface.

Of course you need a waste stream high in metal, preferably shredded to a uniform size so the particles are mostly pure metal. The reject stream will be mostly plastic and glass, which you can burn in an incinerator for power, though you will need a high-temperature incinerator plus precipitators and filters so that the more toxic gases, such as dioxins, either decompose or are filtered out of the hot gas.
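The two-stage sort described above can be sketched as a toy model; the material list, conductivity figures (MS/m, approximate) and deflection threshold are illustrative only:

```python
# Toy model of a two-stage scrap sort: a magnet removes ferrous items,
# then an eddy-current stage deflects non-ferrous conductors into a
# side chute, and the non-conductive residue falls straight through.
STREAM = [
    ("steel",     {"ferrous": True,  "conductivity": 10.0}),
    ("copper",    {"ferrous": False, "conductivity": 59.0}),
    ("aluminium", {"ferrous": False, "conductivity": 38.0}),
    ("glass",     {"ferrous": False, "conductivity": 0.0}),
    ("plastic",   {"ferrous": False, "conductivity": 0.0}),
]

def sort_stream(stream, eddy_threshold=1.0):
    bins = {"ferrous": [], "non_ferrous_metal": [], "residue": []}
    for name, props in stream:
        if props["ferrous"]:
            bins["ferrous"].append(name)            # pulled out by the magnet
        elif props["conductivity"] > eddy_threshold:
            bins["non_ferrous_metal"].append(name)  # deflected by eddy currents
        else:
            bins["residue"].append(name)            # falls straight down
    return bins

print(sort_stream(STREAM))
```

A second pass over the `non_ferrous_metal` bin with a tighter threshold would then split copper from aluminium by their differing conductivity.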
The following users thanked this post: hamdani yusuf

6
That CAN'T be true! / Re: Could satellite surveillance be used to follow an individual?
« on: 27/04/2017 07:36:58 »
A mirror that could image a person on the street clearly would be around ten times the size of the Hubble mirror, and it would have to be lifted to geosynchronous orbit. You would need several of them to track individuals, and tracking would only be possible on absolutely clear, windless, cloudless days, or, with a much larger mirror, in infrared at night, with the same requirements for wind and cloud plus low humidity.

Active tracking would require megawatts of power, which means either a good few square kilometres of solar panels plus a few tons of battery to keep it operating through orbital eclipses, or a megawatt-class nuclear reactor at around 30 tons for the core alone, before any shielding.

As current rocket technology can only get around 10 tons to GEO, rocket including payload, this would almost never be done. Even taking the cheaper route, I would estimate around 200 of the cheapest 55-million-dollar launches per satellite just to assemble it in LEO, and then around a year to raise it slowly to GEO and put it into operation, using the power plant to drive xenon thrusters up to that slow orbit.
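The mirror-size claim can be sanity-checked against the Rayleigh diffraction limit; the wavelength and the 1 m ground-resolution target here are my own assumptions, not figures from the post:

```python
import math  # only used implicitly; the formula needs no trig here

# Diffraction-limited mirror size for resolving ~1 m detail from
# geosynchronous orbit.  Assumed: green light, 1 m target resolution.
WAVELENGTH = 550e-9        # metres
GEO_ALTITUDE = 35_786e3    # metres above the equator
TARGET_RES = 1.0           # metres on the ground

# Rayleigh criterion: theta = 1.22 * lambda / D, and ground resolution
# x = theta * h, so D = 1.22 * lambda * h / x.
diameter = 1.22 * WAVELENGTH * GEO_ALTITUDE / TARGET_RES
print(f"mirror diameter: {diameter:.1f} m")                    # ~24 m
print(f"Hubble mirrors (2.4 m) across: {diameter / 2.4:.0f}")  # ~10
```

So a roughly 24 m mirror, about ten Hubble mirrors across, for metre-scale detail; resolving finer detail scales the diameter up proportionally.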
The following users thanked this post: Srednic

7
Plant Sciences, Zoology & Evolution / Re: Is it true puppies get the best out of both parent's genes?
« on: 22/04/2017 15:47:12 »
I can agree that the pavement special is the best. Generally the sorta-terrier is a good indoor dog and the sortador a good yard dog; they are something like a terrier and a labrador respectively, just closer to a generalised, nondescript, mostly-brown dog. Best you can get.
The following users thanked this post: SquarishTriangle

8
General Science / Re: Should we ban pit bulls?
« on: 17/04/2017 07:50:02 »
Such licences used to exist here, but it was finally realised that so few people actually paid for them that, counting the council employees hired to enforce the scheme, it cost many thousands of times more than it brought in, and served no real purpose, so it was scrapped. What remains is a raft of laws about dog and cat inoculations and kerb laws, most of which are typically never enforced.
The following users thanked this post: SquarishTriangle

9
That CAN'T be true! / Re: Does an empty hard drive or memory card weigh less than a full one?
« on: 16/04/2017 14:32:56 »
Hard drives might lose mass through outgassing of volatile components of the materials used to make them, but that happens whether or not data is stored on them; it is strictly a function of temperature, time and, for the lubricants, whether the platters are spinning. The data itself changes the mass by an amount easily a few orders of magnitude below the changes caused by adsorption of atmospheric water into the materials that make the drive up, such as the printed circuit boards, the epoxy packages and even the paint film. Even the aluminium casing changes mass as hydrogen left over from casting diffuses out of it, and in the newer helium-filled drives the helium slowly diffuses out through the seemingly solid case in any case.
The following users thanked this post: chris

10
General Science / Re: How does the ISS detect the approach of a foreign object?
« on: 14/04/2017 20:26:51 »
They rely on the US satellite tracking system, which uses a few large ground-based radars to gather data on every object in orbit larger than about 10 cm across. Each object is stored as a time, position and velocity, from which its position can be calculated within a block of space with fair accuracy. A computer simulation then flags objects predicted to approach within 10 km of the ISS, and those objects get further orbit determination to refine the accuracy of their predicted paths.

These predictions are not very accurate long term, but they are good for around 24 hours as a rough test of whether something will be in the same place as the ISS at the same time. The controllers can then judge whether it will miss or come close (the accuracy is not good enough to say it will hit), and either move the station or simply put the crew into safe confinement in the Soyuz capsules in case it does hit.

In general there are multiple hits per orbit, mostly small flakes of paint from older missions, tiny pieces released as the plastics in spacecraft sunshields disintegrate, and other debris. The relative speed and energy of these particles varies from almost zero, for material in the same orbit or shed from the ISS and the supply craft, up to over 10 km/s for objects in crossing orbits. Those can cause some damage when they hit, but the particles are very small, so they usually leave only a scratch or a tiny dent.

But no, there is no detection on the ISS, just a lot of work on the ground.
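The screening step can be illustrated with a deliberately crude sketch that uses straight-line motion over a short window (real conjunction screening propagates full orbits); all positions, velocities and the sampling step here are made up:

```python
import math

# Crude conjunction screen: propagate two objects with straight-line
# motion over a short window and flag any approach inside 10 km.
def min_separation(p1, v1, p2, v2, window=60.0, step=1.0):
    """Minimum sampled distance (m) between two objects over `window` s."""
    best = float("inf")
    t = 0.0
    while t <= window:
        d = math.dist(
            [p + v * t for p, v in zip(p1, v1)],
            [p + v * t for p, v in zip(p2, v2)],
        )
        best = min(best, d)
        t += step
    return best

# ISS moving along x at ~7.7 km/s; a hypothetical debris object crossing
# its path at right angles.
iss_pos, iss_vel = (0.0, 0.0, 0.0), (7660.0, 0.0, 0.0)
debris_pos, debris_vel = (50_000.0, -50_000.0, 0.0), (0.0, 7660.0, 0.0)

sep = min_separation(iss_pos, iss_vel, debris_pos, debris_vel)
if sep < 10_000:
    print(f"conjunction alert: closest approach {sep / 1000:.1f} km")
```

With these made-up numbers the closest sampled approach is about 5 km, inside the 10 km screening box, so the object would be flagged for more careful orbit determination.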
The following users thanked this post: chris

11
Technology / Re: Is the speed of a hard drive invariant?
« on: 18/03/2016 16:50:25 »
Speed is very specific to the particular drive. While the platters rotate at constant angular velocity, a modern hard drive records data at a roughly constant density per unit length of track, so the raw data rate varies as the head moves between the outer edge, where the lowest block numbers sit and the rate is highest, and the inner tracks, where it is lowest. The rate varies further with the encoding applied so the data can be recorded reliably, with forward error correction and spectrum-spreading data added, plus encryption if the drive uses it. These small variations are smoothed out by the buffer built into the drive, typically something like 64k, 128k or 256k depending on the drive type and intended application.

Then you get variations caused by head movement: after a seek the drive has to confirm it is on the correct track by reading some sector data, write when the right sector passes under the head, and read the next sector's data before the following write. Also complicating things is the drive's remapping of bad sectors: the next logical block to be read or written may not lie in the next physical sector but may have been relocated to a spare block (spares are interspersed across the drive surface during manufacture and hidden, like so many internal operations, from the outside, so the drive appears perfect while in reality it is quite unreliable, leaning heavily on error detection and correction to recover the data it presents). That relocation costs a head movement and then a few revolutions to settle on the correct track. Tracks are so close together that the only way to find the right one is to seek to the approximate position, move slowly while reading until the correct track is identified, and then wait for the right block to come under the head.

Reading can be worse, as the drive often has to use error correction to recover the data despite noise, or perform multiple reads and reconstruct the data from best guesses across them. If this happens too often, the firmware (secret sauce again) will decide to reallocate that block's data to a spare when idle and mark the block internally as relocated and unusable. Thus a drive that appears perfect can go from working to unreadable very quickly once the spares are used up and it can no longer swap out growing defects.

But to the original question: as long as the data arrives at a rate below the drive's worst-case write ability, the drive can keep up and write it. If it arrives faster, the drive buffers until the buffer is full, then simply discards data (mostly the last lot) and returns an error code saying it could not write it. On reads, the data rate is set by the drive: requesting data faster simply results in the drive reporting not-ready until the buffer fills, and the read stalls.
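The radius dependence of the sustained data rate can be sketched numerically; the RPM, linear bit density and radii below are illustrative guesses, not figures from any real drive:

```python
import math

# Sustained transfer rate at constant linear bit density and constant
# RPM: the velocity of the track under the head, and hence the data
# rate, scales directly with track radius.
RPM = 7200
BITS_PER_MM = 50_000       # assumed bits per mm of track length

def track_rate_mb_s(radius_mm):
    """Raw data rate (MB/s) for a track at the given radius."""
    revs_per_s = RPM / 60
    track_len_mm = 2 * math.pi * radius_mm
    bits_per_s = revs_per_s * track_len_mm * BITS_PER_MM
    return bits_per_s / 8 / 1e6

outer = track_rate_mb_s(45.0)   # near the edge of a 3.5" platter
inner = track_rate_mb_s(20.0)   # near the spindle
print(f"outer track: {outer:.0f} MB/s, inner track: {inner:.0f} MB/s")
# The ratio is exactly 45/20 = 2.25: outer tracks stream more than
# twice as fast as inner ones at the same RPM.
```

This is why drive benchmarks show sequential throughput falling steadily as the test moves from the low block numbers toward the end of the drive.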
The following users thanked this post: chris, Ophiolite


©The Naked Scientists® 2000–2017 | The Naked Scientists® and Naked Science® are registered trademarks created by Dr Chris Smith. Information presented on this website is the opinion of the individual contributors and does not reflect the general views of the administrators, editors, moderators, sponsors, Cambridge University or the public at large.