Time-Shift: Veritas liberabit vos
The truth shall set you free

Divers paradise

October 16th, 2023

Ever since 2015, when I spent a vacation in Curaçao, I had hoped to come back to the Caribbean at some point and do much more diving on those awesome reefs. So when my change of employer was coming up this year, I took the chance and organized a trip to Bonaire for me and some of my buddies.

Bonaire, though also one of the Netherlands Antilles, is an altogether different “animal” than Curaçao. While Curaçao is very much frequented by tourists and divers alike, there is almost no “normal” tourism on Bonaire. Since it only has a handful of very small sand beaches, it is not very attractive in the eyes of the usual tan-seeking tourists. However, thanks to the heavy efforts of the diving legend Hans Hass, the coral reefs around Bonaire have been a protected natural reserve since 1979. Because of this, the coral reefs there have stayed quite spectacular, while the reefs of Curaçao and Aruba are suffering heavily from fishing, boat traffic and tourism.

We reached Bonaire from Hamburg via Amsterdam with the Royal Dutch airline KLM. Taking 16 hours (with a 3 1/2 hour layover in Amsterdam), it was quite a daunting flight. Luckily, in modern times, with a full range of media available during the flight, passing the time was not too hard. After all, when you get up at around 2 o’clock, sleep comes easily even in the loud and cramped environment that an airliner is. So after an additional 1 hour layover in Aruba, we touched down at Flamingo Airport Bonaire. By then it was already quite clear that we would not do much more than get our rental car, drive to the villa that we had rented and fall asleep rather quickly 😛 .

The next day, after having breakfast at a small motel, our first stop was the car rental (to sign some papers). Right after that, we stocked up on food and utilities for the days to come (most things are quite expensive on Bonaire; the only exception is vegetables and fruits, as they come directly from Venezuela). Last but not least, we got our dive gear ready, drove to the dive base and got our cylinders for the first dive.

At this point it was already quite clear that we would not be able to do more than one dive that day, as we, being new on the island, had spent too much time on all the new things. Entering the water, expecting at least some refreshment from the 32°C outside, we were shocked that the water was not much cooler (31°C). Even at greater depths it stays this way (the coolest temperature during the whole vacation was to be 29°C at 30m). The promise of supreme reefs and astonishing wildlife was not broken, though. Right on the first dive, after putting our heads under the waves, we encountered a sea turtle. And the list of sea life just kept filling up.

During the next days, we developed a routine of early breakfast at the villa, grabbing our cylinders at the dive base and then doing two dives during the day. Every now and then we did a night dive too (being the third dive of that day). One of the special dive spots was the wreck of the famous Hilma Hooker, where I switched from my macro to the wide-angle lens for some extra impressive pictures.

Sometimes we adjusted the routine a bit, due to things happening during the day (I unfortunately developed a bad ear infection and had to pause diving for a day), and on one of the days (right in the middle of the vacation) we had a scheduled dive pause, to give our bodies a chance to completely desaturate at least once. That day, though, was filled with exploring the island above water, visiting the Washington Slagbaai national park, taking pictures of flamingos and hummingbirds and grabbing some dinner at the famous Lac surfing area in the evening.

During the second half of the vacation, we took a trip to Klein Bonaire, where I spotted a sea horse during a snorkel expedition (of course, the only time I did not have my camera with me). We also managed to get a nice little sunburn, as the island is practically nothing but white sandy beach and crystal clear turquoise water.

Of course, souvenirs and some things for ourselves should not be missing, so on one of the last days we also managed to get a bit of shopping done, buy some postcards for our friends back home and enjoy the culture of the island a bit. As our vacation was nearing its end, we intensified the diving again. That is what we were there for, after all.

But all things have to end at some point, and so we found ourselves back at Flamingo Airport again, hauling our luggage through the halls and onto the airplane. It wasn’t easy to say goodbye to the sun and the awesome diving experiences. But now that I’ve been to the Caribbean for the second time, I’m pretty sure that I’ll be back someday soon ;-).

You can find all the pictures of the dives and the over-water fun on Bonaire in my gallery.

Technically complete

June 5th, 2023

Not too long ago (2020) I started my journey into technical diving. After my basic technical training (DTSA TEC Basic 2020) and my training for mixed gases (DTSA Nitrox** 2021), I also did a Trimix certification (DTSA Trimix* 2021) and a gas blending course (DTSA Gasmischer* 2021). So I have had a very busy few years.

The eventual goal has always been to also become a technical dive instructor. Although I already had a Nitrox TL* certification, the Nitrox TL** is the real entry into technical diving on the instructor side. This year, the course was finally offered and, luckily for me, right at my doorstep. So on the 14th of April I made my way to Hemmoor to be “judged” by a team of Nitrox TL***s with regard to my capabilities and skill level. Because this assessment is mandatory and has to have a positive outcome in order to take part in the exam at all, I was understandably nervous.

The weekend began with a demonstration of the different techniques (frog, back, heli and flutter kicks, valve drill and surface marker placement), continued on the Saturday with us pupils showing our stage skills (bottle rotations, gas changes, re-stowing of the stage) and culminated on Sunday in a deep and long decompression dive with multiple gas changes. Even though most of my dives went quite well, when we were called into the room with the TL***s one by one to get our evaluation, I was still nervous. Luckily the instructors had the same impression as me and were quite satisfied with what they saw. The final exam was to be in June, so I had at least a little time for a bit of practice.

Unlike previous exams, especially the instructor courses, this exam was to be a little bit different. After proving on the Friday of the weekend that we had gotten our act together, our job was to actually lead a DTSA Nitrox** course with real pupils, basically proving our instructor capabilities on the subjects 😀 . Luckily for me, I knew most of my pupils from previous courses, so it wasn’t too hard to establish a rapport in the group. Unlike the TEC Basic courses, there is no preparation weekend for the DTSA Nitrox** course. So the pupils jump straight into the exam and have to prove that they are able to build on their TEC Basic experience from the past.

All in all, the weekend went rather well. We instructors managed to keep the pupils alive and they managed to keep us happy. So after a challenging weekend, we all went home happy and with new capabilities (and responsibilities). This basically concludes my dive instructor training, as a Trimix TL* or Trimix TL** is out of reach for now due to insanely high helium prices. Maybe someday I’ll dot the i and become a CCR instructor? But that is very far in the future.

At long last

September 6th, 2022

It’s been a while since I started my voyage to become a VDST TL**, and the journey was anything but a straight line. Especially because of the corona pandemic, there were multiple delays and cancellations of courses and exams :-(. But about a month ago, I finally took the last steps towards becoming a full-blown dive instructor and started my voyage to the dive resort Gulen (Norway).

The weather was unfortunately very unpleasant the first few days, but one of the reasons for me to do the exam in Gulen was to test my mettle. I got what I wished for… Nevertheless, all the exercises were there to prepare us for the coming courses and to let us pass on to our future students a rich amount of personal knowledge and experience in harsher conditions. So as the course continued and the exercises got harder and more plentiful, we certainly gained both.

Around the middle of the third day, the weather became much more palatable and we enjoyed both the scenery and the warmth of the sun in Norway. We also had some spectacular sunsets and very often were up far too long (the sun is up very long that far north, and sometimes you forget the time talking about diving, wrecks and… well, diving).

Now, with the weather improved, we were finally able to do what we came for: wreck dives! The Gulen area sadly lacks a rich fish life (probably all eaten up by now), but the wrecks are plenty and spectacular! The first two wreck dives I sadly had to sit out ashore, as I had trouble with my ears ;-(. Day 4 finally was the day that I got into the water for some wreck diving!

The wreck diving, and also some diving at the house reef, continued for the next days. By the time we were close to the finale, all the divers and instructors started to look a bit rough around the edges. The first pieces of equipment also had their failures; every now and then a diver came out rather wet in their dry suit :-D.

Finally (and much faster than everyone thought) the last day and the final dives were in front of us. A final dive to the Frankenwald as a “fun” dive finished up the course for all the participants successfully. Not everyone got their certificate yet, as some requirements were still missing for a few (corona made it rather difficult to get everything in order in time). But everyone was happy nonetheless. The challenging environment meant that only participants with solid diving experience were taking part, which made the whole experience very pleasant. It was instructive and challenging, but most importantly: it was a lot of fun!

This concludes the sport diver part of my instructor training. The instructor training for technical diving will most likely follow next year.

One, Two, Trimix

August 5th, 2021

It’s been about 8 years now since I started my journey as a diver. From PADI OWD via CMAS**/*** to Trainer-C and dive instructor in 2019, all of my diving “career” has been pretty much standard. Last year I then took my first steps into the area of technical diving with the DTSA TEC Basic certification. A few weeks ago I finally stepped up the game a notch by attaining my first Trimix certification (DTSA Trimix* / CMAS Normoxic Trimix Diver).

Apart from the apnea, sidemount, UW photography/videography and rebreather certifications, this pretty much brings me close to having all the certifications in the VDST. A DTSA Trimix** (CMAS Hypoxic Trimix Diver) is certain to follow at some point in the future. But at the moment I want to invest more time into honing my diving skills and furthering my instructor levels (dive instructor level 2 and instructor for technical diving are both in the making 😉 ).

Veni, vidi, vici

October 10th, 2019

About two and a half years ago, I set myself the long-term goal of becoming a dive instructor in my dive club. Last Saturday this plan was brought to a successful end at Deep Blue Diving Fuerteventura.

In order to get there, my TL* aspirant buddies and I had to endure quite some hardships. Apart from exams under water, we also had to show that we were capable above water, both as an exam and because the weather conditions were anything but favorable. More than once we had to cling desperately to the boat, getting our equipment in, while the Atlantic Ocean was hitting us with high waves and howling wind.

In the evening, though beat and tired, there was stuff to learn and dives to prepare for the next day. You could often see us acting out how a specific exercise should work under water the next day. Every now and then it worked out quite differently. But thanks to our good teamwork and everyone’s stubborn will to see this thing through, in the end we all managed to achieve the desired result.

Very tired but glad not to have faltered, we all landed in Stuttgart on Tuesday. I came to Fuerteventura to get my TL* license, but I came home with much more than that: a bunch of new friends :-).

Aaaannd…. Action!

September 19th, 2019

Last Sunday I had the unique opportunity and pleasure to take part in a small feature, shot at Kreidesee Hemmoor, for the children’s program 1, 2 oder 3. The feature was produced by Mingamedia GmbH and the underwater recordings were done by Jens-Uwe Lamm from uw-Film.

Luckily I did not have to step in front of the camera (I’m a bit camera shy :-D) but was only there to help light the scene and provide assistance if necessary. So my main task was “shut up and stay out of the frame”, which I’m apparently good at :-P. Julian, Marven and Helge from my dive club (Tauchsportgemeinschaft SCUBI e.V. Stade) had to do all the hard work.

Unfortunately the weather conditions and the visibility left a lot to be desired, so my lighting became more important than we had originally thought. Holding the light in the right direction, keeping an eye on the camera and the man behind it (so I could move where he needed me) and keeping an eye out for my buddy at the same time is nothing if not challenging.

Providing some much-needed light (Photo: Jens-Uwe Lamm from uw-film)

After the first scene at the wreck of the Hemmoor was shot, I could relax a little bit, while my buddies had to film an above-water scene in a small boat that the dive center at the Kreidesee kindly lent us for the shoot. First on the pier and later on the water. The perfect opportunity for me to get out my camera gear and take some pictures of the shooting.

Eventually both scenes were successfully captured as well, and we used the boat to get to the destination of our next and final shoot, the big white. Finding that shark from the surface, when you’ve never seen the landscape from this side of the water surface, isn’t as easy as we’d thought. But Jens showed us the right direction. Eventually the last scene was captured and Jens had the opportunity to shoot some B-roll and take a group picture.

The group behind the feature (Photo: Jens-Uwe Lamm from uw-film)

All in all it was a very interesting and challenging opportunity and I am grateful to have had the chance to be part of it.

New Gallery for UW Photography

October 30th, 2018

I finally got my gear together to take my Nikon D750 under water (Nauticam NA-D750). So I’ve added a new gallery to my website to show off the pictures I’m taking while diving. At the moment all my pictures are taken in Hemmoor (Kreidesee Hemmoor), where I do my diving on the weekends. But other diving spots will surely follow. Enjoy! And if you like them, do leave a comment and tell me what you think!

Curaçao 2015 – A vacation to remember.

November 13th, 2016

I finally found the time to add the pictures of my Curaçao vacation from 2015 to the album. This truly was one awesome vacation. Curaçao… I’ll be back!

The connection horror or how I hacked my own data

November 29th, 2015

A lot of people know the situation: you get a new and fast Internet connection, but your provider is a support nightmare. They hand you a practically black-boxed router that automatically gets its connection data from the Internet, and you have no chance of ever getting at this data. After all… why would you? Isn’t it much easier this way? Well, let me tell you a little secret the providers don’t like to have advertised that much: not only do they push the configuration to your new router, they can also change it ANYTIME they want. If you have a regular setup like most people, it looks like this:

Network

(Granted, not everyone has a NAS at home. But they become more and more common as the devices become simpler and the data people want to store (like e.g. audio and video data) needs to be shared between devices in the network. So for the sake of this article, let’s assume the regular user has some kind of network-capable storage. Technically even a smartphone or a wifi-enabled HiFi system is a network-attached data storage, but let’s keep it simple.) In the kind of setup that we see in the above picture, the router that you use is the only barrier between your data (or the device that holds your data) and the Internet. Suddenly a device you thought just “provides you with Internet access” becomes the only thing between your privacy and total disclosure of your private data to the world! Worse yet: even if you are naive enough to assume your provider will never do you harm, will never be hacked and will never be forced by the government to give them access to your data, there is hardly a month where security groups on the Internet and at companies don’t find horrific bugs in common router firmware. With the providers being the only ones who can update your router, you have to put total trust in them to do so in a timely manner. Sadly, they are usually way behind when it comes to updating the devices. So obviously this is a setup that is not acceptable. A possible solution would look like this:

It’s possible, but it has a few rather bad downsides:

  • You waste power on a device you practically don’t use (the provider router).
  • The provider (or someone who hacked it) can still do stuff to the other router, close ports or mess with connections.
  • You still need to use the provider router for the SIP connection, because you don’t have the login data for that.
  • Your connection speed might drop from having two firewalls and two NAT systems behind each other.
  • In the worst case, you can’t open ANY ports towards your network, because your provider doesn’t want it.

It’s obvious that the best solution would be to run your own router (for me this is my Gentoo server) and your own telephone system (Asterisk in my setup), which you can maintain yourself and where you can implement your own security plan as needed. When I switched my Internet provider this week (for a lot more speed), I had exactly this problem. They just give you a router (FritzBox) and nothing else. For me it was clear from the beginning that I was going to use my own solution, as I have been for the last 4 years. This is the story of how I managed to do just that.

My first idea (which I had before I even had the thing in my hands) was to hack the router right after it had downloaded the configuration from my provider. I knew from articles on the Internet that there was a slim chance of getting a telnet daemon running on the FritzBox and connecting to that. However, when the device was done downloading the data, it became clear pretty fast that this door had been slammed shut by my provider. In fact, there was no getting into that router from any angle. It took me the better part of a day to realize that this idea was a dead end.

I needed a new plan… and I had one. I knew from experience that most companies don’t take security that seriously. So I thought to myself: “Why should that router send all the login data encrypted over my DSL line?” After all, who really has the capabilities to sniff a very-high-frequency modulated signal in a cable that is mostly under ground? (Yes, the government has, but they can just get that data if they want to.) Fortunately, the FritzBox has a sniffing program integrated for all interfaces, designed for customer support problems (horrifying, I know, but in that moment… pure gold!). It records all packets sent over a specified interface in the Wireshark format. No sooner said than done, I had a neat amount of PPPoE packets on my hard drive, recorded during the login procedure via DSL. It didn’t take me too long to find the data that I was looking for: three different PPPoE connections. One for the Internet line, one for the voice channel and a third one for the TR-069 channel (provider remote access for router configuration)! It was unencrypted, as I had thought, and the passwords and usernames were in plaintext *place facepalm and happy dance here*.
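To illustrate why this kind of sniffing works at all: a PAP Authenticate-Request (RFC 1334) carries the username and password as plain length-prefixed strings, so parsing them out of a capture is trivial. The sketch below is a minimal, self-contained illustration; the sample frame is fabricated, not from my actual capture.

```python
# Minimal sketch: parse a PAP Authenticate-Request payload (PPP protocol
# 0xC023, RFC 1334). The frame below is made up for illustration -- in
# reality the bytes would come out of the FritzBox sniffer's pcap file.
import struct

def parse_pap_auth_request(payload: bytes):
    """Return (peer_id, password) from a PAP Authenticate-Request."""
    code, identifier, length = struct.unpack("!BBH", payload[:4])
    if code != 1:  # code 1 = Authenticate-Request
        raise ValueError("not an Authenticate-Request")
    id_len = payload[4]
    peer_id = payload[5:5 + id_len]
    pw_off = 5 + id_len
    pw_len = payload[pw_off]
    password = payload[pw_off + 1:pw_off + 1 + pw_len]
    return peer_id.decode(), password.decode()

# Fabricated example frame: username "user@isp", password "secret"
frame = (bytes([1, 42]) + struct.pack("!H", 4 + 1 + 8 + 1 + 6)
         + bytes([8]) + b"user@isp" + bytes([6]) + b"secret")
print(parse_pap_auth_request(frame))  # ('user@isp', 'secret')
```

No decryption, no cracking: the credentials are sitting right there in the packet.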

The last thing that was missing was the username and password for the SIP connection to my provider. And here I hit another dead end. While PPP login using unencrypted PAP authentication is not that unusual, the SIP protocol by default uses an HTTP digest challenge-response (a hash, not the plaintext password) as its login procedure. Though I could easily get the username (it was unencrypted, of course :-/), it proved impossible to get the password this way. (Technically it wasn’t impossible, but I would have had to put an immense amount of CPU/GPU time and energy into reverse-calculating that hash to a password. Considering it turned out to be 8 characters long, that might have taken months, if not more, of a permanently running cracking program.) But I was not about to give up that easily. After all, as Jean-Luc Picard once said: “Things are always impossible until they’re not!” I needed yet another plan.
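For the curious, here is a rough sketch of how such a digest response is computed (the classic RFC 2617-style MD5 scheme, without qop; all values below are made up). Only the final 32-character hash crosses the wire, which is exactly why the capture yielded the username but not the password.

```python
# Sketch of an RFC 2617-style MD5 digest response, as used by SIP REGISTER.
# All credentials, realm and nonce values here are fabricated examples.
import hashlib

def md5_hex(s: str) -> str:
    return hashlib.md5(s.encode()).hexdigest()

def sip_digest_response(user, realm, password, method, uri, nonce):
    ha1 = md5_hex(f"{user}:{realm}:{password}")  # secret only enters here
    ha2 = md5_hex(f"{method}:{uri}")
    return md5_hex(f"{ha1}:{nonce}:{ha2}")       # this is what gets sent

resp = sip_digest_response("0401234567", "sip.example-isp.net",
                           "secret", "REGISTER",
                           "sip:sip.example-isp.net", "abc123nonce")
print(resp)  # a 32-char hex digest; recovering the password from it
             # means brute-forcing the whole password space
```

Turning that hash back into an 8-character password is exactly the months-long cracking job described above.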

I remembered that, though I did not know that password, neither did my router when I first unpacked it. I started digging into the TR-069 protocol. And there I found the weak link I was looking for. Although TR-069 uses HTTP as its means of transport, it is recommended to use HTTPS for obvious security reasons. My provider, of course, did not. When I saw the CPE management URI starting with http:// I knew I was onto the solution. I set my router back to its original state and disconnected the DSL cable. After rebooting the box, I immediately started the sniffer on the Internet line.

At first I was only getting rather useless PPPoE session data (PADI, PADO, PADR, PADS) or chunks of TCP data that weren’t readable. I was already becoming somewhat frustrated when the sniffer hit gold: a series of HTTP packets! I quickly put them together (they were fragmented) and the result looked something like this:

POST /live/CPEManager/CPEs/Auth_Basic/avm/ HTTP/1.1
Host: ***.***.***.***:80
Content-Length: 2776
Content-Type: text/xml; charset="utf-8"
SOAPAction: "cwmp:Inform"

<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/" xmlns:soap-enc="http://schemas.xmlsoap.org/soap/encoding/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:cwmp="urn:dslforum-org:cwmp-1-0">
<soap:Header>
<cwmp:ID soap:mustUnderstand="1">100</cwmp:ID></soap:Header>
<soap:Body>
<cwmp:Inform>
<DeviceId>
<Manufacturer>AVM</Manufacturer>
<OUI>00040E</OUI>
<ProductClass>FRITZ!Box</ProductClass>
<SerialNumber>************</SerialNumber></DeviceId>
<Event soap-enc:arrayType="cwmp:EventStruct[4]">
<EventStruct>
<EventCode>7 TRANSFER COMPLETE</EventCode>
<CommandKey></CommandKey></EventStruct>
<EventStruct>
<EventCode>M Download</EventCode>
<CommandKey>*************</CommandKey></EventStruct>
<EventStruct>
<EventCode>4 VALUE CHANGE</EventCode>
<CommandKey></CommandKey></EventStruct>
<EventStruct>
<EventCode>1 BOOT</EventCode>
<CommandKey></CommandKey></EventStruct></Event>
<MaxEnvelopes>1</MaxEnvelopes>
<CurrentTime>0001-01-01T00:02:00</CurrentTime>
<RetryCount>1</RetryCount>
<ParameterList soap-enc:arrayType="cwmp:ParameterValueStruct[8]">
<ParameterValueStruct>
<Name>InternetGatewayDevice.DeviceSummary</Name>
<Value xsi:type="xsd:string">InternetGatewayDevice:1.4[](Baseline:2, EthernetLAN:1, ADSLWAN:1,ADSL2WAN:1, Time:2, IPPing:1, WiFiLAN:2, DeviceAssociation:1), VoiceService:1.0[1](SIPEndpoint:1, Endpoint:1, TAEndpoint:1), StorageService:1.0[1](Baseline:1, FTPServer:1, NetServer:1, HTTPServer:1, UserAccess:1, VolumeConfig:1)</Value></ParameterValueStruct>
<ParameterValueStruct>
<Name>InternetGatewayDevice.DeviceInfo.HardwareVersion</Name>
<Value xsi:type="xsd:string">*********************</Value></ParameterValueStruct>
<ParameterValueStruct>
<Name>InternetGatewayDevice.DeviceInfo.SoftwareVersion</Name>
<Value xsi:type="xsd:string">************</Value></ParameterValueStruct>
<ParameterValueStruct>
<Name>InternetGatewayDevice.DeviceInfo.SpecVersion</Name>
<Value xsi:type="xsd:string">1.0</Value></ParameterValueStruct>
<ParameterValueStruct>
<Name>InternetGatewayDevice.DeviceInfo.ProvisioningCode</Name>
<Value xsi:type="xsd:string">*****</Value></ParameterValueStruct>
<ParameterValueStruct>

…

Of course there was real data in there; I just put the stars in to cover up sensitive information. Somewhere in this chunk of data (apart from all the config data that I already had from my other sniffing attempts) I found two chunks that were like the second coming for me on this day:

<Name>InternetGatewayDevice.Services.VoiceService.1.VoiceProfile.1.Line.1.SIP.AuthUserName</Name>
<Value xsi:type="xsd:string">*************</Value>

<Name>InternetGatewayDevice.Services.VoiceService.1.VoiceProfile.1.Line.1.SIP.AuthPassword</Name>
<Value xsi:type="xsd:string">*************</Value>

Bingo! The last puzzle pieces of my odyssey! As a last measure of verification, I flashed my router with a de-branded firmware and entered the data that I had collected into the appropriate fields (to make sure that there was no other special stuff in that old firmware that was needed to make the connections). And it worked like a charm. Even though it might not seem like such a big deal to some… for me, those two days of hacking to get my own data (after all, I pay for that connection) were quite an experience in themselves. Especially since I was successful! Another win for free choice and against oppression :-P.
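Pulling the interesting values back out of a captured cwmp:Inform is a one-liner with the standard library. The sketch below works on a simplified, made-up stand-in for the real (namespaced) SOAP body; the parameter names match the TR-069 data model, the values are fabricated.

```python
# Sketch: extract TR-069 parameter values from a (simplified) cwmp:Inform
# fragment. The XML and credential values below are made up placeholders.
import xml.etree.ElementTree as ET

inform = """<ParameterList>
  <ParameterValueStruct>
    <Name>InternetGatewayDevice.Services.VoiceService.1.VoiceProfile.1.Line.1.SIP.AuthUserName</Name>
    <Value>alice</Value>
  </ParameterValueStruct>
  <ParameterValueStruct>
    <Name>InternetGatewayDevice.Services.VoiceService.1.VoiceProfile.1.Line.1.SIP.AuthPassword</Name>
    <Value>s3cret</Value>
  </ParameterValueStruct>
</ParameterList>"""

def extract_params(xml_text: str) -> dict:
    """Map every ParameterValueStruct's Name to its Value."""
    root = ET.fromstring(xml_text)
    return {pvs.findtext("Name"): pvs.findtext("Value")
            for pvs in root.iter("ParameterValueStruct")}

params = extract_params(inform)
print(params["InternetGatewayDevice.Services.VoiceService.1"
             ".VoiceProfile.1.Line.1.SIP.AuthUserName"])  # alice
```

With the real capture you would additionally strip the SOAP envelope and handle its namespaces, but the principle is the same: the provider ships these values in the clear.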

And the moral of the story? Thank god most ISPs are too lazy to implement real security. If all those connections had been encrypted, it would’ve been nearly impossible to get all that data. Crazy and scary at the same time :-P.

Fixing bugs the right way: Be a scientist

April 9th, 2015

So lately a thought has started to form in my mind that has been there for quite some time, but due to my environment has become more and more dominant in my head. A lot of times when I see people dealing with bugs, their first reaction is “This can’t be a problem with my code!”. Though understandable to some degree, this is of course bad for the project (and for team morale). Before long you have bugs floating from one person to another, being closed, reopened again, starting that nasty circle all over. Most of you will probably know what kind of bugs I’m talking about. So what to do about this? The answer is simple: be what most of you have been trained to be! Be a scientist! I came across this very intriguing graphic a few days ago:

A Rough Guide to Types of Scientific Evidence

Though technically oriented towards medical students, this pictures pretty clearly what evidence you can trust easily and what might be a bit fishy. The first thing that catches the eye is that expert opinions are lowest in the list of trustworthy evidence. In other words: the phrase “This can’t be a problem with my code!”, said by no matter how much of an expert, is basically worthless when it comes to fixing bugs using the scientific approach. Even a hint of where the problem may lie is to be regarded as just that: a hint and no more. Especially if you tell yourself that you cannot be the origin of that bug, always remind yourself that you can’t be sure until it is proven.

The next step on the list is the experimental approach. This is what you’ll see most in code debugging and bug hunting: changing values or code, seeing how the program reacts to those changes and then extrapolating from that reaction. Though most people would guess this to be a good approach, it really is not, because this way all you get is more data and, most of the time, more puzzles instead of solutions. There are a million things that can go wrong using this approach. The code might be time-critical and only fail if run without being stopped by the debugger. Values might be different using a debugging approach. And last but not least: you might simply run your debugging code on different hardware than the real thing.

The next three steps are basically only good for data collection and finding clues as to where the bug may lie. Sometimes this gives you a really good insight and helps you track down the problem faster. But normally this would be the job of the QA department; they are responsible for finding a way to reproduce the bug and including it in their bug report. In 90% of the cases we will be writing deterministic software, meaning: even if we use some kind of weak random generation with a predefined seed, the software will always behave the same way when run on the same hardware and given the same input. If it doesn’t (and believe me, that’s really hard to prove), then your hardware is broken. So as long as you have not proven your hardware broken and you are sure your program is deterministic, there is always a way to find reproduction steps. And once you have those, you’re on a good path.
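The determinism point can be made concrete with a tiny sketch: with a fixed seed, even a “random” scenario replays identically, so the seed recorded in a bug report is a full reproduction recipe. (`scenario` here is a made-up stand-in for the software under test.)

```python
# Sketch: seeded randomness is deterministic, so a recorded seed
# reproduces the exact same "random" input every time.
import random

def scenario(seed: int) -> list:
    rng = random.Random(seed)       # local RNG: no hidden global state
    return [rng.randint(0, 99) for _ in range(5)]

run1 = scenario(1337)
run2 = scenario(1337)
assert run1 == run2                 # same seed, same input, same behaviour
print(run1)
```

Note the use of a local `random.Random` instance rather than the module-level functions: shared global RNG state is exactly the kind of hidden input that breaks reproducibility.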

Which leaves the last two possibilities: the first being randomized tests (which might or might not be possible, depending on your software and the way you input data) and the last being a scientifically approached review of the code.

Randomized tests are a good candidate to be run automatically by your build servers. They can find bugs, and you’ll always have the input data that generated those bugs (which makes finding reproduction steps pretty easy). Depending on your program this might be pretty hard to achieve; however, you should at least plan in some automated testing before you even begin to write code. In the optimum case, you even have the whole continuous deployment pipeline ready to use before you start to write a single class.
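A minimal randomized test loop might look like the sketch below: each iteration draws a fresh seed, generates input from it, and collects the seed on failure so the exact case can be replayed later. (`buggy_sort` is an intentionally broken, made-up stand-in for the code under test; it drops duplicates.)

```python
# Sketch of a randomized test loop: generate input from a seed, check an
# invariant, and keep the failing seeds as perfect reproduction recipes.
import random

def buggy_sort(xs):
    return sorted(set(xs))  # deliberately wrong: loses duplicate elements

def randomized_test(iterations=200):
    failures = []
    for seed in range(iterations):
        rng = random.Random(seed)
        data = [rng.randint(0, 5) for _ in range(rng.randint(0, 10))]
        if buggy_sort(data) != sorted(data):
            failures.append(seed)   # the seed alone reproduces this case
    return failures

fails = randomized_test()
print(f"{len(fails)} failing seeds, first few: {fails[:3]}")
```

Because the input is derived purely from the seed, a failing seed in the CI log is as good as a full reproduction script.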

After you’ve done all this and moved through all the fact-finding steps, you should have a pretty fair idea where your bug is hiding. If not, start from the beginning. After all, science means that you are looking for the truth and not “your idea of the truth”. So don’t start bug hunting trying to prove it is not your code that is broken. Start bug hunting trying to prove what code is broken. If it turns out it wasn’t your stuff, all the better: comment on the bug and send it along to the code owner. And if it turns out that you did some really bad stuff: thank god you found it. Every truth is good. And if you learn from your (or others’) mistakes, it will not happen again.

So once you have all the information and know where to look, start reviewing your code. I like to do this by stepping through it with a debugger and looking at what I’ve really got. More often than not, the data I see is not what I expected, and I can find the problem really quickly. Sometimes it’s hard to track down where that data is coming from, but at least now you know what’s causing the symptoms. If you have proven that it’s not your code that causes the problem, but the data that comes in from somewhere else, it’s OK to collect all the info in the bug report and pass it on to the person you think knows that code best. After all, they might be able to give an expert opinion :-P.