Month: February 2017

Road Block – Changing to Virtual Reality


The last time I posted, I was talking about learning Python so I could work with the Raspberry Pi, and that has been going well. Using what I knew from my AP and post-AP Computer Science classes, I was able to learn the basics of Python rather quickly, which is all I need.

Unfortunately, as my on-site advisor and I put hours of research into the sensors for our build, we quickly realized that the sensors we were looking for were either too expensive or just flat-out faulty. So my advisor and I decided to set aside the hydroponic grower idea and venture into the world of Virtual Reality, and within a few days we had come up with a project. I'll be going into depth about this new project in a future post (hopefully by tomorrow). In this new project, we will be working with Virtual Reality machines, stationary trainers, Virtual Reality software, and, lastly, the versatile Raspberry Pi.

The good news is that everything I've done up to this point is still very useful. The programming for the Raspberry Pi unit will still be used, and the software I have learned, such as SolidWorks, will still come into use in my engineering studies at the UA.


Breastfeeding Tips (for all the future mothers)

1. Take your Vitamins and eat healthy! 

It is very important to have a healthy diet in order to keep up your energy and supply yourself with important vitamins. Eating fruits, vegetables, and whole grains will help with milk production. Calcium (milk, yogurt, cheese), iron (meat, seafood, beans), vitamin C (citrus fruits, broccoli, cabbage), and multivitamins will help mothers produce enough milk to supply a baby.

2. Drink Water!

Breastfeeding mothers should consume about 8 glasses of water each day. They could also drink other fluids including juice, milk, broths, tea, and soup.

3. Use a breast pump if needed.

If it is difficult for the baby to take in a significant amount of milk and gain enough weight, it might be easier to pump and then feed the baby from a bottle.

4. If you are having trouble producing enough milk, take turns. 

In a 30 minute feeding period, you can have your baby feed on one breast for 15 minutes and then the other one for 15 minutes. Or, you can feed from one breast for 30 minutes, and then the next feeding session you can feed for 30 minutes on the other breast.

5. Make sure the baby is getting enough milk. 

A newborn baby should be eating about 2 ounces every 2-3 hours. As the baby gets older, they should be eating about 3-4 ounces per feeding. If the baby is not gaining enough weight, you can try supplementing some breastfeeding time with formula or breast milk from a bottle. A baby is supposed to gain about 1 ounce a day, so if they are not gaining enough, it is important to have the baby drink from a bottle so you can closely track the milk intake. By 6 months, a baby should be double their birth weight.



Water Maze: New Rats

As I mentioned in my last post, I am working with two different cohorts of rats. The first group of rats was introduced to the lab a few months ago, and I have already completed multiple stages of the experiment with them: the Water Maze, Habituation, the Linear Track, and currently the W-maze Alternation task. The second group of rats was just introduced to the lab last week, which is why they received health checks.

This week, the second group of rats completed the Water Maze. The Water Maze lasts a week and is crucial in testing the rats' cognitive abilities, visual acuity, and motor skills. This test determines the yoked pairs (one young and one old rat). Yoked pairs are used because, although the research focuses on the interaction between the hippocampus and PFC, it also examines the changes in the brain that occur through the aging process, so we pair a young rat and an old rat with similar performance. Lastly, the Water Maze can also be used to assess whether a rat is blind. If we find a rat to be blind, we cannot use that rat for the upcoming experiments.

The first few days of the Water Maze test spatial memory. The platform is set at a single location for these first few days. We add paint to the pool to make the water opaque so that the platform is hidden. Around the pool are visual cues, in the hope that the rats will take mental note of where the platform is, so that with each trial, when the rat is placed into the pool at a different insertion point, it can find the platform in a shorter amount of time. The time it takes to find the platform is recorded for each trial. Following these few days of testing spatial memory is a day of testing visual acuity. Instead of keeping the platform in a set place and hidden, the platform is visible and changes location for each trial. This solely tests whether the rats can find the platform with their eyes.

This is a digital sketch of the Water Maze I found online:



Between each trial the rats are held in a temperature-controlled cage. They all cozy up together to keep warm. I took a picture of them because it was so cute.


Unfortunately, after completing the last trial of the Water Maze, one of the rats was very weak and sick-looking. We noticed his eyes were struggling to stay open and their color was changing from red to pale pink. His head began rocking and his body could not maintain balance. After trying to keep him stable and warm, we were told he probably had a brain tumor because of how he was behaving (these rats also get tumors often). He had to be put down. I was very sad to see him go because he was so cute and sweet (not to sound weird, but I grow attached to these rats). Anyway, his brain is now in the fridge for us to examine or dissect if we wish (silver lining?!).



Weeks 2 and 3: Creating a Case Study and Confidentiality Issues

These past two weeks have been very busy in regards to my work, but also difficult to write about. After I completed all my readings regarding the science behind Social Anxiety, I began to work with my off-site advisor, Dr. Andrews, on what it takes to make a survey and what defines a Case Study.

The problem I face in updating this blog while completing my project is that as my project progresses, the amount of information I have the liberty to share becomes further constricted. This is because, while I work hard to construct questions that will allow me to analyze social anxiety triggers across a wide range of students, neither the questions nor the thought process behind them can be shared before the surveys are released. Also, once the surveys are completed, the analysis process cannot be described in too much detail prior to my presentation, in order to protect the results of my experiment and the confidentiality of the test subjects.

I will be sure to keep this blog updated with the various processes I am using in my project. I hope to keep this blog as detailed as possible, all while protecting those helping me and reducing the possibility of error in my experiment.

Thank you all for reading and I hope you have a great rest of your week!



Day 10: Mixing With a Twist

Previously, when we have done mixing projects, the mixing has been done on the computer, and the music has stayed in digital form. Today, however, we used a different method of mixing, one that I personally find to be my favorite.

To get a different sound from the mixing process, we mixed the songs to tape, using a two-channel tape machine, with one channel for the left side of the stereo image and one for the right. We mixed the songs on the laptop, then sent the finished mix to the tape, recording it there. Then we played it back onto the laptop, where the song had made a full circle but changed in the process.

By sending a song through tape, the sound becomes more rounded out, and in a sense, becomes more human, rather than just staying in the digital atmosphere. The songs that we mixed included violins and other stringed instruments, and the artist and his band, who were in the studio as well, felt that it was necessary to mix this way to preserve the human element.

While all of the mixing was going on, I had the opportunity to meet and talk to the band that had recorded the songs, and have some laughs. I also got another day of familiarizing myself with tape machines and some more hands-on experience working with one.

The process of mixing was still a bit tedious at times, but the added element of the tape machine added some depth to the process, and kept the mixing interesting.

Here is a picture of the tape machine used for the mixing process today.


Third Week

During Week 3, I spent most of my time with the medical clinic's billing specialist to learn more about her job and how medical insurance works. I also spent some time in the infusion clinic observing infusions of medications given for rheumatoid arthritis and osteoporosis.

The billing specialist first talked about PQRS (the Physician Quality Reporting System) and how Medicare incentivizes or penalizes physicians based on the quality of the care provided for patients and documented by physicians. This system was put in place in 2015/2016 and will be replaced by MACRA (the Medicare Access and CHIP Reauthorization Act of 2015) and MIPS (the Merit-Based Incentive Payment System) in 2017. Because the clinic is a specialty clinic, the government assesses quality through registry-based reporting on certain measures. For rheumatology, these measures include preventive care and BMI screening, pain assessment and follow-up, tuberculosis screening, glucocorticoid management, and assessment and classification of disease prognosis. Once all the reporting is sent to the government, Medicare applies a value payment modifier that adjusts payments to the physicians. The differential payment is based on the quality of care provided to Medicare patients, as measured by performance on PQRS measures, and on the cost of the care that was provided.

The billing specialist and I talked about how patients are billed through Medicare and commercial insurance plans. I learned that Medicare usually pays 80% of the cost. We went over deductibles, copays, coinsurance, and out-of-pocket maximums. The deductible is how much the patient must pay out of pocket before the insurance company will begin to pay based on plan benefits. A copay is a flat fee assigned to various services, which the patient is also responsible for paying. Coinsurance is a percentage of the allowed amount for which the patient is responsible. The out-of-pocket maximum is the amount the patient must pay before insurance will begin to pay 100% of the cost.
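The way these pieces stack on a single claim can be sketched as a small calculation. This is a simplified, hypothetical model (the function name and the dollar figures are invented for illustration; real plans layer in copays, network rules, and service-specific benefits):

```python
def patient_responsibility(allowed, deductible_remaining, coinsurance_rate, oop_remaining):
    """Rough sketch of how one claim splits between patient and insurer.

    Simplified model: the patient pays any remaining deductible first,
    then coinsurance on the rest, capped by the remaining
    out-of-pocket maximum.
    """
    deductible_part = min(allowed, deductible_remaining)
    coinsurance_part = (allowed - deductible_part) * coinsurance_rate
    patient_pays = min(deductible_part + coinsurance_part, oop_remaining)
    insurer_pays = allowed - patient_pays
    return patient_pays, insurer_pays

# Example: $1,000 allowed amount, $200 of deductible left, 20% coinsurance,
# $5,000 still to go on the out-of-pocket maximum.
patient, insurer = patient_responsibility(1000, 200, 0.20, 5000)
# patient = 200 + 0.20 * 800 = 360; the insurer covers the remaining 640
```

Once the patient's yearly out-of-pocket maximum is hit, the cap in the middle line is what flips the insurer to paying 100%.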


Day 15- An Actual post


Quote of the Day: “Turns out it was simpler than I thought” – Pretty much every programmer at some point.

Music of the Day: Mostly The Who with little sprinkles of Crüe tossed in.

Today was a roller coaster. I came in with the mental state of “OK, I actually need to do stuff this week,” and pretty immediately ran into problems. I was again focusing on the teleporter, and trying to fix the issue from last week where those two specific bots couldn’t attack after going through. I tried a few things and couldn’t really get anything to work, so I decided to take a closer look at the error and go through step by step.

If you’ll remember, the error Unity was throwing was a NullReference, which means it was looking for something that wasn’t there. I knew what line it was throwing the error at (you can double-click on the error and it will show you in the code), but I could not for the life of me figure out why. The entire robot object, attack script and all, was in the DontDestroyOnLoad category, which means it persists between levels unless you tell it otherwise. So all of the data that it takes in when it’s first initialized should remain there in every single level. So, taking a page out of my theater tech book, I worked the problem in reverse.

In theater tech, especially with lighting and sound, if something isn’t working you go through step by step and check each individual component, starting with the simplest and moving to the most complex. With lighting, for example, if a light isn’t turning on, you first check if it has power, if the power itself is on, if the cable is faulty, and if the lamp has burned out in the fixture, and then move on to more complicated and harder-to-fix stuff, like whether the power connections are sound or the lens is burnt (you also check the shutters; I once had the embarrassment of going through all of that only to realize I still had the shutters in. Never made that mistake again).

So in my code, I added Debug.Log statements to strategic parts. Debug.Log statements print out a message or value to the console so that you can easily read it, and are super useful for making sure that everything is called in the right order. So I worked through linearly, making sure that each individual part was being called and executed when they needed to be. And they were. So once again, I was at a loss. The only other thing I could think to do was check the individual variables, and lo and behold, I found the problem.

The problem: the Start() method is only called one time, and not when a new level is loaded. The non-jargon version of the problem: The attack is only set up the first time the bot is put into a level, not the next time. So in the first level, everything is peachy and works as it should. But after teleporting, the script neglects to find all of the necessary components again, so the game gets confused when there’s nothing for it to retrieve. The solution: use OnLevelWasLoaded, which, true to its name, calls things when a level is loaded. This still presented problems, because now all of that data was initialized in later levels, but not the first. So the final solution was to use both Start and OnLevelWasLoaded to make sure everything is set up and ready to go. With that done I finally had a fully functional teleporter for single player survival, and was ready to move on.
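The two-method fix boils down to a general pattern: run the same reference-finding setup both on first creation and again on every level load. Here is a minimal sketch of that pattern in plain Python (the actual project is Unity C#; the class, method, and scene names below are invented for illustration and just mirror the roles of Start and OnLevelWasLoaded):

```python
class AttackScript:
    """Toy model of a persistent game object that must re-find its
    scene references every time a new level loads."""

    def __init__(self, scene):
        self.target = None
        self.start(scene)          # first-time setup (Unity's Start)

    def start(self, scene):
        # Look up the components this script depends on in the current
        # scene; skipping this after a level change is what caused the
        # NullReference in the post.
        self.target = scene.get("player")

    def on_level_was_loaded(self, scene):
        # Run the same setup again after a level change, since the old
        # scene's references are gone.
        self.start(scene)

    def attack(self):
        if self.target is None:
            raise RuntimeError("missing reference - not re-initialized")
        return f"attacking {self.target}"

level1 = {"player": "player_level_1"}
bot = AttackScript(level1)
bot.attack()                      # works in the first level

level2 = {"player": "player_level_2"}
bot.on_level_was_loaded(level2)   # re-initialize after the teleport
bot.attack()                      # still works
```

Dropping the `on_level_was_loaded` call reproduces the bug: the object survives the level change, but its lookup is stale, so the first attack after teleporting fails.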

I only had about 45 minutes before I needed to leave, so Jeremy asked me to fix a small menu issue with highlighting choices. If you’ve ever played a game (or, for that matter, used any program with a user interface), you’ve probably seen how buttons are highlighted when you hover over them. In our case, this worked fine for a mouse but was having problems when you tried to select things with a controller. This was a pretty boring fix: I just had to make a call that happened before anything else to highlight the currently selected menu option, and then copy and paste it about 50 times for every single menu button across the game. And with that done, I was out for the day.

If you’ve made it this far, congratulations! If I make a post tomorrow it will probably be pretty short, because it’s my birthday and I doubt I will want to stay at work for long. And I will be gone on Wednesday; a Green Day concert awaits me. But I will be back on Thursday! Thanks for reading.

Day 10 

Today I washed out some of the overdyed pieces, and touched up a couple that still needed more dye. Below are some of the drying overdyed pieces. 

I then worked on painting the colored sections of the following block in fabric medium, which is like fabric paint but colorless. This will ensure that the sections that I colored in will stay colored if the quilt is washed. 

Next, I burned a screen of my motif and printed some samples. 

Week 0

My internship has not formally started because I have been busy captaining the BASIS girls’ basketball team, but this is a good opportunity to provide some background on my SP. My internship is at Tu Nidito, a local charity that provides grief support for children who have experienced the death or illness of someone close to them. I have been volunteering at Tu Nidito for more than a year now – I started in the Thursday 1 bereavement group, which meets on the first and third Thursdays of the month and supports kids who have had a loved one die, and about six months later I joined the Monday 1 CPC group, which meets on the first and third Mondays of the month and supports kids who have had parents diagnosed with cancer or another life-altering and life-threatening illness (CPC stands for “Children with a Parent with Cancer”).

Before I began facilitating groups, I received a full day of training on grief, children, and Tu Nidito’s purpose. I also have other experience working with children – I babysit, tutor, and have been on BASIS’s Oregon trip three times, each time volunteering for a week as a counsellor at Camp Westwind, an outdoor science school. I’ll be going back to Oregon for a fourth time in April as part of this SP. It was the intersection of my Oregon and Tu Nidito experiences that first inspired this project – the methods of redirection and behavior modification that they teach are markedly different. Westwind’s training focused on keeping students engaged and participatory, with clear rules and consequences, while Tu Nidito’s focused on respecting students’ internal states and understanding their behavior as part of the grieving process. The way these different missions (education vs. healing) lend themselves to different disciplinary styles immediately struck me, especially because kids can be part of both groups. I’m curious to explore how different ideas on behavior modification affect students’ behavior and how those techniques can be improved to support grieving students while still making them effective learners and members of their communities.

I start interning this Thursday!

Filter Curves and First Results

So today I finished up the method I was describing in the last post. What I had initially started to do was approximate the curves to get an equation that I could then integrate. Instead, I realized it would be easier, and probably more accurate, to use a trapezoidal approximation, since I was given lots of data points. After getting the area under each curve, I multiplied it by the flux density for the respective filter and added them up. I then converted the flux here at Earth to the flux near the star. I ended up getting mixed results. On the one hand, it gave a very good approximation for the inner boundary for the star Proxima Centauri (0.025 AU, compared to other estimates of 0.023), but it does not give good estimates for the outer boundary of Proxima Centauri or for another star, Tau Ceti.
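The trapezoidal step and the Earth-to-star conversion can be sketched in a few lines of Python. The filter samples and flux-density value below are invented for illustration; the real numbers come from the published filter transmission tables and photometry:

```python
def trapezoid(xs, ys):
    """Trapezoidal approximation of the area under sampled data points."""
    return sum((xs[i + 1] - xs[i]) * (ys[i + 1] + ys[i]) / 2
               for i in range(len(xs) - 1))

# Hypothetical filter transmission curve sampled at a few wavelengths (nm)
wavelengths = [400, 450, 500, 550, 600]
transmission = [0.0, 0.6, 0.9, 0.5, 0.0]

area = trapezoid(wavelengths, transmission)

# Multiply the area by the filter's flux density (made-up value here) to
# get the flux received at Earth through this filter; repeat for each
# filter and sum the results.
flux_density = 2.0e-12  # W/m^2/nm, illustrative only
flux_at_earth = area * flux_density

# Inverse-square scaling from the flux measured at Earth to the flux at
# distance r from the star, where d is the star's distance from Earth
# (d and r in the same units):
def flux_at_distance(flux_earth, d, r):
    return flux_earth * (d / r) ** 2
```

The trapezoid sum is exact for straight-line segments between samples, which is why dense data points make it competitive with fitting a curve and integrating analytically.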

I have many theories as to why the estimate is so poor, but I mainly think it is a combination of two things: the current estimates of the habitable zones for these stars were probably calculated with a different (most likely more complex) method, and our model seems to be very sensitive to small changes in data, which would mean small errors in the magnitudes measured by the telescopes could have a large impact. The other possibility is that the flux requirement I found for the outer boundary is wrong. When I used the current estimates of the habitable zones, I found the flux at the outer edge of the zone to be 290 and 410 W/m^2. The fact that these numbers are so close to each other might indicate that the 960 W/m^2 boundary I initially found is incorrect and that the true value is in fact much lower. This has become the goal for tomorrow: to check more sources about how much flux is needed at the inner and outer boundaries of the habitable zone.
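Going from a candidate flux requirement back to a boundary distance is just the inverse-square relation run in reverse. As a hedged sketch (the function name is mine, and the numbers in the example are illustrative, not real measurements):

```python
import math

def boundary_distance(d_star, flux_at_earth, flux_required):
    """Distance from the star at which the flux falls to flux_required,
    from the inverse-square law F(r) = F_earth * (d/r)^2 solved for r.

    d_star and the returned distance are in the same units.
    """
    return d_star * math.sqrt(flux_at_earth / flux_required)

# Illustrative example: if a star at distance 1 (arbitrary units)
# delivers a flux of 4 units at Earth, the flux drops to 1 unit at
# twice that distance from the star.
r = boundary_distance(1.0, 4.0, 1.0)
```

This also shows why the result is so sensitive to the flux requirement: the boundary scales with the square root of the flux ratio, so halving the required flux pushes the boundary out by a factor of about 1.4.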