THE GOOD OLD DAYS

Previous Conferences

In the archive you’ll find information related to past Let’s Test conferences, such as presentation slide decks and Let’s Test related blog posts.

Pick the conference of your choice from the left-side menu, and let us know if you’re having trouble finding material from a particular conference or talk.

Enjoy!


Let’s Test 2012

Thank you to all the attendees, facilitators, speakers and sponsors who helped make the inaugural Let’s Test conference a total pleasure! On this page you’ll find slide decks and material relating to (most of) the keynotes, tutorials and sessions that made up the Let’s Test 2012 program (with the exception of the experiential sessions that didn’t use slides).

We’ve also gathered collections of the extensive #LetsTest Twitter feed, the many fantastic post-conference blog posts and a small selection of photos for your reading and viewing pleasure. If you have material that you would like to see added here, don’t hesitate to send an e-mail to johan.jonasson@lets-test.com.

See you all next year!

Cheers,
Johan Jonasson

Keynotes & Tutorials
Michael Bolton – If It’s Not Context Driven You Can’t Do It Here
Rob Sabourin – Elevator Parable
Scott Barber – Testing Missions in Context From Checking To Assessment
Julian Harty – Open Sourcing Testing

Rob Sabourin – Just In Time Testing
Scott Barber – Context Appropriate Performance Testing – From Simple To Rocket Science
Rikard Edgren – Exploratory Test Design

Sessions
Alan Richardson – Testing Hypnotically
Alexandru Rotaru – Going Against The Stream
Anders Dinsen – Testing in the Black Swan Domain
Anne-Marie Charrett – Coaching Testers
Ben Kelly – The Testing Dead
Christin Wiedemann – You’re a Scientist – Embracing the Scientific Method in Software Testing
Dawn Haynes – Zen Of Software Testing
Fiona Charles – Strategies for a Successful E2E Systems Integration Test
Henrik Emilsson – Do You Trust Your Mental Models
Huib Schoots – So You Think You Can Test
Johan Åtting – Mixup Testing – A Cross Team Test Activity
Michael Albrecht – From Good To Great With xBTM
Neil Thompson – Testing as Value Flow Management – Organise Your Toolbox (pdf) & (ppsx)
Oliver Vilson – Journey To The Other Side
Rikard Edgren – Curing Our Binary Disease
Simon Morley – Testing Lessons From The Rolling Stones
Torbjörn Ryber – Effect Mapping
Zeger Van Hese – Making The Case For Aesthetics

Blogs
Ilari Henrik Aegerter – What Is the Secret Sauce that Made Let’s Test Such a Great Conference?
Scott Barber – Let’s Test Raises the Bar for CONFERences
Huib Schoots – Let’s Test 2012: an awesome conference! – (Part 1), (Part 2), (Part 3), (Part 4)
Maria Kedemo – Reflections from Let’s Test – a conference for everyone?
Martin Jansson – A Let’s TestLab Story
Markus Gärtner – Let’s Test: If it’s not context-driven, you can’t do it here
Anders Dinsen – Let’s Test Live Blogging (Page 1), (Page 2), (Page 3), (Photos)
Magnus Wärle – Let’s Test 2012
Fiona Charles – Let’s Test 2012 – A Personal View
Christin Wiedemann – Let’s Test 2012
Mohinder Khosla – Enabling Learning through Social Media – LetsTest 2012 Conference
Jasminka Puskar – Back from Let’s Test
Yassal Sundman – Context Appropriate Performance Testing at Let’s Test
Joep Schuurkes – Let’s Test 2012 – time travel advice
David Greenlees – I attended Let’s Test, without attending!
Zeger Van Hese – Exploratory bug diagnosis
Simon Morley – Thoughts From Let’s Test
Simon Morley – Let’s Test Conference Carnival #1 (pre-conference)
Alexandar Simic – My first Let’s Test Conference
Johan Jonasson – Let’s Test – in retrospect

Twitter feed
Let’s Test 2012 Aggregated Twitter Feed (pdf)
Mohinder Khosla – Tweets from LetsTest Conference 2012
Duncan Nisbet – Let’s Test 2012: Visualisation of tweets

Photos
(Many of the blog posts in the section above are also packed with photos.)

Program 2012


Let’s Test 2013

Thank you all who attended Let’s Test 2013. We had a blast!
Here you’ll find different resources from Let’s Test 2013.

Keynotes

James Bach – How Do I Know I’m Context-Driven?
Johanna Rothman – Becoming a Kickass Test Manager
Scott Barber – Business Value of Testing

Tutorials

James Bach – Professional Test Reporting (slides, .pdf)
James Bach – Professional Test Reporting (reference material, .zip)
Scott Barber – Improving system performance through a few minutes, every day, from everyone
Scott Barber – Rapid Performance Testing
Keith Stobie – Scripted Manual Exploratory Testing
Huib Schoots & Jean-Paul Varwijk – Thinking and Working Visually for Software Testers
Simon Morley – Lifelong Analysis Skills: For Explorers and Process Junkies Alike
Dawn Haynes – To Infinity and Beyond! Simple Sampling Techniques for Test Coverage

Sessions

Maria Kedemo – Hiring To Solve The Puzzle
Madis Jullinen – Hitting the wall – How (previous) action gaming experience could serve as a gateway into better testing

Simon Morley – Failing: The Very Human Side of Testing
Ilari Henrik Aegerter – Observation Ninjas & Description Superheroes
Huib Schoots – Set Your Course: Future Roles and Trends in Testing
Fiona Charles – Test Leadership Heuristics
Iain McCowatt – The Commoditization of Testing: Bucking the Trend
Griffin Jones – What is Good Evidence?
John Stevenson – Information Overload and Bad Decisions
Zeger Van Hese – Testing in the Age of Distraction
Alan Richardson – The Evil Tester’s Guide to Web Testing
Dawn Haynes – Talking to Triangles
Tobias Fors – Systems Thinking For The Rest Of Us

Photo collections

Let’s Test’s own photo wizard: Martin Nilsson’s photos
Anders Dinsen – My Let’s Test photos
Richard Robinson’s photos
Jesper Lindholt Ottosen’s photos

Blog posts

Mohinder’s Let’s Test 2013 blog post
Helena Jeret-Mäe – The GNIKCUF Awesome conference – Let’s Test 2013
Carsten Feilberg – Let’s Test 2013 – it was HUGE!
Erik Brickarp – Let’s Test – a summary
Erik Brickarp – Finding new ways to use what I learned
Pete Duelen – An introvert’s retrospective
Levente Balint – Things I Saw and Sketched at Let’s Test
A few words from the Incubators about Let’s Test 2013
Duncan Nisbet – Let’s Taste
Andreas Cederholm – Let’s Test 2013 Sum-up
Aleksis Tulonen – Passion-Driven Community
The Test Eye – Reflection from Let’s Testlab 2013
Ruud Cox – Let’s Test 2013 Sketchnotes
Richard Robinson – Let’s Test 2013 – Where unicorn tears are formed?
Jesper Lindholt Ottosen – Release now, the joy you had and let it flourish
Miagi-do – A quick wrap-up of Let’s Test
A view from the Chair with Michael Bolton
Aleksis Tulonen – Applying Six Thinking Hats to Let’s Test Conference
Aleksis Tulonen – Audio Of Several Let’s Test Sessions Shared
Parimala Hariprasad – Journey to and through Let’s Test 2013

Videos

Let’s Test YouTube channel
Let’s Test 2013 opening
Ilari Henrik Aegerter – Observation Ninjas & Description Superheroes

Twitter feed

Let’s Test 2013 Twitter Collection

Program and other 2013 conference web links

Let's Test South Africa 2015

This event took place at the beautiful Kloofzicht Hotel and Spa in Magaliesberg, November 15th – 17th 2015. On this page you’ll find slide decks and other supporting material from the conference. The page is under construction as we gather the material from the speakers, but please don’t hesitate to contact us if you’re having trouble finding what you’re looking for.

Slide decks & Workshop material

The Ongoing Evolution of Testing in Agile – Dawn Haynes (Opening Keynote)

Story Mapping – the art of contextualizing user stories into getting thin slices of deliverable value – Mark Pearl and Angie Doyle

Why Pair Testing can (and will) improve the quality of your work – Simon ‘Peter’ Schrijver

Dimensions of Testability – Ben Kelly & Maria Kedemo

Examining Your Testing Skills – Alexandra Casapu

Excel in Testing – Beverley Christensen

Photos

Let’s Test Flickr


Let’s Test 2014

Archive page for Let’s Test 2014

Please let us know if you’re looking for any specific material that you can’t find, or if you have material you want to have linked from this page.

Keynotes

Jon Bach – A critical look at best practices

Tutorials

Jon Bach – Exploring best practices
Alan Richardson – Hands-on exploration of page objects and abstraction layers with Selenium Webdriver
Alan Richardson (additional material)
Tim Lister – Risk Management Workshop
Tim Lister (additional material)
Rob Sabourin – Test fundamentals for expert testers
James Bach & Pradeep Soundararajan – Review by Testing
Huib Schoots & Ilari Henrik Aegerter – Observe, model, sketchnote, design, test

Sessions

Dawn Haynes – Realism in Testing
Huib Schoots – What testers can learn from social sciences
Anna Royzman – Quality Leader – the changing role of a software tester
Martin Hynie & Christin Wiedemann – Can Playing Games Actually Make You a Better Tester?
Pradeep Soundararajan – Testers as respected business problem solvers
James Bach – Describing what you see
Anand Ramdeo – Keeping London on the move
Lee Hawkins – Prepare yourselves for an offshoring success story
Alessandra Moreira – The DIY Guide to Raising the Test Bar

Photo and video collections

Let’s Test’s own photo wizard: Martin Nilsson’s photos
Let’s Test YouTube channel
Tuesday morning pre-keynote workout

Blog posts

James Bach – Let’s Test at Let’s Test
Richard Bradshaw – Let’s Test 2014
Alessandra Moreira – It’s all about the people
Erik Brickarp – Let’s Test 2014
David Greenlees – Let’s Test Sweden
Neil Studd – Let’s Test 2014 in review
Anand Ramdeo – Let’s Test party is over – see you next year?
Helena Jeret-Mäe – Let’s Test 2014
Helena Jeret-Mäe – Becoming Healthily Uncertain Doubt by Doubt
Jon Bach – Working Towards Precision
Amy Phillips – Let’s Test 2014 reflections
Ruud Cox – Let’s Test 2014 sketchnotes
Hannes Lindbom – Let’s Test Day 1
Hannes Lindbom – Let’s Test Day 2
Hannes Lindbom – Let’s Test Day 3
Hannes Lindbom – Let’s Test Let’s Summarize


Let's Test Oz 2014

Archive page for Let’s Test Oz 2014

Please let us know if you’re looking for any specific material that you can’t find, or if you have material you want to have linked from this page.

Keynotes

Fiona Charles – The battle for our hearts and minds
[More coming soon]

Tutorials

Oliver Erlewein – Hacking Performance
[More coming soon]

Sessions

Anders Dinsen – All is fair in love and war
Beth Skurrie – Pacts to the rescue!
Brian Osman – It takes an entire village to raise a tester
Mike Talks – Deprogramming the cargo cult of testing
Sigge Birgisson – QA as in Quality Assistance
Margaret Dineen – Operation Successful, patient dead
Adam Howard – Agency in Testing – When seeing isn’t believing
Katrina Clokie – Black and white: Software testing for scientists
Let’s Test Oz – Welcome presentation

Photo and video collections

Let’s Test YouTube channel
Let’s Test Oz Flickr page

Blogs & Twitter collections

Mike Talks – The Let’s Test Oz Experience
Kim Engel – Let’s Test Oz 2014 day 1
Katrina Clokie – The Context Driven Test Community
Katrina Clokie – Test Leadership
Scott Griffiths – Let’s Test 2014 Sydney Australia
Simon Schrijver – Down under to Let’s Test Oz
Daniel Luschwitz – Behind Enemy Lines – My Experience at Let’s Test Oz
uTest – Top Tweets
Eventifier (collection of Tweets, etc.)
Anders Dinsen – Welcome to Oz
Aaron Hodder – The responsibilities of conference


TLT Oz 2013

Tasting Let’s Test took place on August 5th, 2013. This is an archived page. If you’ve landed here, you’re probably interested in one of the topics below.

Context-driven testing is an approach to testing that has been growing rapidly in the past few years. The context-driven tester is an active thinker, one who values personal skill development and rapid testing over producing boilerplate documents that nobody ever reads or follows. In short, context-driven testing is about finding the best possible solution for each unique testing problem, and then implementing that solution in an effective and cost-efficient manner, with minimum waste. The context-driven challenge is to continuously re-evaluate and re-optimize testing to best fit the needs of the business and to get testing done as quickly as possible, while still making sure that everything that needs to get done, gets done.

Exploratory testing, the dominant approach to testing in Agile environments, originated as a defined and teachable practice within the context-driven community, and exploratory testing practices are continuously being developed by context-driven testers.

Context-driven testing aims to give testers the tools they need to solve testing problems in any imaginable situation. If you have yet to discover this approach to testing, or if you want to take the next step in your professional development, then you don’t want to miss this conference!

Date: August 5th, 2013
Where: Dockside, Cockle Bay Wharf, Sydney, Australia
Early Bird Price (until July 15th): $350*
Standard Price (after July 15th): $400*

* Tax deductible depending on your income tax situation.

Meetup members: use your promo code to receive a hefty discount.

The first thirty to register and pay for Tasting Let’s Test will automatically go into a draw to win a ticket to Let’s Test Oz.

It’s a Let’s Test Conference

Tasting Let’s Test is not just any old conference. It stands proudly under the Let’s Test banner of conferences. What makes Let’s Test conferences so unique? Read this blog post or take a look at some of the content to find out why. More details will follow; in the meantime, our speakers will be:

Keynote speaker:
Rob Sabourin – AmiBug.Com, Inc., Canada
Speakers:
Tim Coulter – Tester and founder of NoteApp, USA
Erik Petersen – Emprove, Australia
Rajesh Mathur – Test Delivery Manager at Cathay Pacific Airways, Hong Kong
Anne-Marie Charrett – Test Consultant, Testing Times, Australia
Let’s TestLab hosted by:
Richard Robinson, Australia

Tasting Let’s Test Programme Details

Keynote – Rob Sabourin

Can quality products be delivered when teams, customers, users and stakeholders have conflicting values? Rob Sabourin suggests that the notions of “on time, on quality and on budget” are meaningless unless you are “on purpose”.

What do people value?  Why do they value it?  How does it matter?

Rob shares some rich and varied experiences leading successful development projects with synchronized core values between stakeholders, team members, customers and user communities throughout their relationship.

Rob examines some projects which were dismal failures due to teams working at cross purposes with conflicting values.

Rob also looks at some of the most absurdly turbulent, chaotic, projects which were tamed and became glowing successes due to a deliberate focus on harmonizing a blend of business, technical, organization, team, individual and cultural values.

Explore how value sync can make a difference in your context.

Rob Sabourin, P.Eng., has more than twenty-five years of management experience leading teams of software development professionals. A well-respected member of the software engineering community, Rob has managed, trained, mentored, and coached hundreds of top professionals in the field. He frequently speaks at conferences and writes on software engineering, SQA, testing, management, and internationalization. Rob wrote I am a Bug!, the popular software testing children’s book; works as an adjunct professor of software engineering at McGill University; and serves as the principal consultant (and president/janitor) of AmiBug.Com, Inc. Contact Rob at rsabourin@amibug.com.

Context Driven Testing in a Mission Critical Environment – Rajesh Mathur

This presentation is based on Rajesh’s experience of implementing and testing a few large mission- and/or life-critical software systems at the companies and clients he has worked for. By sharing his experience, he aims to help people learn from what was done right and what could have been done better; he hopes it will help others avoid the pitfalls and emulate the successful outcomes.

Since his current role is in the aviation industry, he will take a few examples from aviation. However, there will be examples from the healthcare and finance industries too.

The technological aspects of aircraft do not change as rapidly as they do in other areas. However, the recent worldwide competition in the aviation industry has changed this, which has impacted software testing groups as well. From a mission-quality perspective, software testing becomes a very important activity among the other SDLC activities. During his talk, Rajesh will cover the testing strategies that are required to test such software in a mission-critical context.

With more than seventeen years of software testing experience under his belt, Rajesh Mathur has held technical and managerial positions at companies and clients including Nokia and Boots. Rajesh is currently a Test Delivery Manager at Hong Kong’s premium airline Cathay Pacific Airways, with a mission to give software testing its rightful status in the Asia Pacific region. Rajesh has lived and established his career in countries including the US, the UK, India and Hong Kong, and has worked on many high-profile mega-projects.

Rajesh is a Fellow of the Hong Kong Computer Society, a member of the Australian Computer Society and a member of the Association for Software Testing. A context-driven tester, he has been actively working on strengthening the testing community and the practice of software testing through training, mentoring and speaking at conferences. He shares his views at http://www.dogmatictesting.blogspot.com and can be followed on Twitter at @rajeshmathur.

Building a Context Driven Team – Anne-Marie Charrett

Discover the secrets to building a test team ready to handle any situation. That’s what a Context Driven Team does. Because they understand that there is no best practice and that situations inevitably change, a context-driven team knows to expect the unexpected and is ready to face whatever challenges come its way. Like the sound of that? Want your team to be adaptable, flexible and motivated to face the demands of rapid change? Then you need to come to this talk.

Anne-Marie Charrett is a testing coach and trainer with a passion for helping testers discover their testing strengths and become the testers they aspire to be. Anne-Marie offers free IM coaching to testers and developers on Skype (id charretts) and is working on a book with James Bach on coaching testers. An electronic engineer by trade, testing discovered Anne-Marie when she started conformance testing to ETSI standards. She was hooked and has been involved in software testing ever since. She runs her own company, Testing Times, offering coaching and software testing services with an emphasis on Context Driven Testing.
Anne-Marie can be found on Twitter at @charrett and also blogs at http://mavericktester.com.

Agile Testing in Silicon Alley – Tim Coulter

Tim has spent the last six years hopping between startups, building each startup’s testing function as they grew from small to medium-sized businesses. Tim will talk about how context played a huge role in how testing was performed within each organization, and how the actions performed by the testing group (mostly just him) varied widely even though each company claimed to be an Agile shop. Tim will explicitly debunk the myth that Agile is 100% automation; in fact, that often was not the case. He’ll also compare and contrast each experience, from politics to software to actual business needs, and offer suggestions on how to perform testing within small and growing “pivot quick” organizations.

Tim Coulter is an American software engineer who has made a career of being the sole tester for multiple New York City startups. As a self-proclaimed “professional quitter” (don’t tell the startups that) Tim spent the last six years helping startups as they grew from fledgling code houses to 70-100 person companies. Recently, Tim struck a deal with his current employer to work part-time and 100% remotely so he can travel the world at the same time. Tim is also the creator of NoteApp (noteapp.com), an online note-taking and collaboration tool which he hopes will soon become a startup of its own.

Introducing an Exploratory Testing Framework – Erik Petersen

In this session, we’ll test some real small to medium-sized applications for quality and bugs. Through the three main activities of collation, investigation and experimentation, we’ll move through our explorations, first understanding the applications and then experimenting with discovery and failure situations, using tools where relevant. A key to effective exploratory testing is understanding what you are doing, and the ET framework helps people understand the thinking processes involved. While Erik will guide the session and explain the context, a large part of the testing may come from the audience, either as ideas or even by driving the keyboard. While this is accessible as an introductory session, it will also show how to perform industrial-strength ET.

Erik Petersen is based in Melbourne, Australia. For most of his career, he has been involved in custom software development, now focussing on testing and quality. He has also trained and mentored many testers and developers. He is a thought leader in agile and exploratory testing, and was one of the first people to formally propose the idea of paired exploratory testing in 2001. Erik also has extensive experience with test automation and testing tools. He is recognized for his mix of practical knowledge and project experience (and excessive numbers of slides!). Visit the Software Testing Spot at www.testingspot.net. Contact Erik by email via emprove@gmail.com.

The Let’s TestLab – with host Richard Robinson

At Let’s Test we understand that most learning comes from hands-on work. That’s why the Let’s Testlab is an integral part of a Let’s Test event. The Let’s Testlab gives testers an opportunity to try their hands at performing practical testing tasks. Watch this space as we reveal more information on the test lab closer to the event.

Richard has worked across numerous industries at all levels of testing. He enjoys working as a context-driven test consultant/contractor, and promotes a learning-centric approach to testing to his clients. He has been developing his ideas over the past few years since he decided to start thinking about his testing. Richard does not believe in best practice, just better practice than traditional methods.


TLT Netherlands 2014

This event has already happened. Proceed to the archive page for slide decks and other resources.

On March 11th 2014, we’re opening the doors to the first ever Tasting Let’s Test Netherlands! Tasting Let’s Test events are one-day events, designed to give you a bite-sized taste of what the Let’s Test spirit is all about.

The event will take place at De Fabrique in Utrecht, Netherlands. The full day program is available here.

We’re offering the full event at the low rate of €199 (Euro). You can register by following this link.

Keynotes

Getting Serious About Emotions in Testing – Michael Bolton

Software testing is a highly technical, mathematical, rational business. There’s no place for the squishy emotional stuff here. Or is there?

Because of ambition and risk, emotions can run high in software development and testing groups. It can be easy to become frustrated, confused, or bored; angry, impatient, or overwhelmed. Yet if we choose to remain aware of them and open to them, feelings are immediate and powerful sources of information for testers, alerting us to problems in the product and in our approaches to our work, and how we might change things. People don’t decide things based on numbers; they decide based on how they feel about the numbers. Our ideas about quality and bugs are rooted in our desires, which in turn are rooted in feelings.

You’ll laugh, you’ll cry… and you may be surprised as Michael Bolton shows the important role that feelings play in excellent testing, and how you can take advantage of the information they provide.

Michael Bolton is a software tester, consultant, and trainer with 20 years of experience around the world, testing, developing, managing, and writing about software. He is the co-author (with senior author James Bach) of Rapid Software Testing, a course that presents a methodology and mindset for testing software expertly in uncertain conditions and under extreme time pressure.

For Those About To Rock – Henrik Andersson

Let’s have a look at Let’s Test. What is all the fuss about and what makes Let’s Test rock? In this talk I will give you our thoughts, visions and dreams from when we created Let’s Test. What have you experienced during today’s tasting, and what is there left to experience at the full-fledged conference? I will talk about what we are doing to help our community get stronger and sharper. Let’s Test is “for testers by testers”, but what does that really mean?

Henrik Andersson is co-founder and CEO of House of Test Consulting. Besides House of Test, Henrik also co-founded Let’s Test, Europe’s first annual conference on context-driven testing. The conference has set a new bar for testing conferences around the world. 2013 was a busy year: Henrik founded the local user group ConTest in Malmö, Sweden, supported the Let’s Test expansion to Australia and co-founded the International Society For Software Testing (ISST). ConTest is taking the concept of facilitated peer conferences to a broader local community, and the ISST was founded with the mission to put common sense back into testing through advocacy, development of testing skills and growing our community.

Sessions

What testers can learn from the social sciences – Huib Schoots

Testing and informatics are often seen as exact or physical sciences. People perceive that computers always do exactly the same thing. This gets reflected in the way they think about testing: a bunch of repeatable steps to see if the requirements are met. But is that really what testing is all about? I like to think of testing more as a social science. Testing is not only about technical computer stuff, it is also about human aspects and social interaction. IT is often considered a technical science or an engineering discipline. Traditionally testers are techies who focus on analysing requirements and turning them into a series of test cases. Some also analyse product risks and write (master) test plans. The focus is on technical and analytical skills. But testing requires a lot more! Testing is about attitude, skills, communication, behaviour, collaboration and (critical and/or systems) thinking.

The seven basic principles of the Context-Driven School tell us that people, working together, are the most important part of any project’s context. That good software testing is a challenging intellectual process. And that only through judgement and skill, exercised cooperatively throughout the entire project, are we able to do the right things at the right times to effectively test our products. In these principles there is a lot of non-technical stuff that has a major influence in my work as a tester.

This talk gives insight into why testing is a social science. It also gives some examples of what a tester can take away from the social sciences. Anthropology teaches us how people live and interact, and something about culture. Education and didactics help us acquire new, or modify existing, knowledge, behaviours, skills, values or preferences. Sociology teaches us empirical investigation and critical analysis and gives insight into human social activity. Psychology is the study of the mind and behaviour and helps testers understand individuals and groups. Philosophy is interesting because, as James Bach once wrote in Philosophers of Testing: “As I periodically remind my readers and clients, I am a context-driven tester. That requires me to examine the relationship between my practices and the context in which I should use those practices. I don’t know how anyone could be truly context-driven without also being comfortable with philosophy.”

Huib Schoots is an agile test consultant at codecentric, where he shares his passion for testing through coaching, training, and giving presentations on a variety of test subjects. With almost 20 years of experience in IT and software testing, Huib is experienced in different testing roles. Curious and passionate, he is an agile and context-driven tester who attempts to read everything ever published on software testing. A member of the Dutch Exploratory Workshop on Testing, a black belt in the Miagi-Do School of Software Testing and co-author of a book about the future of software testing, Huib maintains a blog on magnifiant.com and can be found on Twitter: @huibschoots.

Drawing to Learn – Sketching For Testers – Ruud Cox

As we as testers test a product, we build mental models of the product. We do this by investigating and evaluating the product from different perspectives. But mental models are volatile, not very reliable and often don’t fit in our minds. That’s why we document these mental models in order to prevent them from being lost or to use them as a basis for discussions or for further investigation.

There is a common notion that information is best represented by text and maybe some diagrams. These text documents are written at the beginning of a project or at the beginning of a phase, but they are almost never used after that during the project. What if we were to start mapping our mental models into visual models and use these models while we test the product at hand? Wouldn’t it be more effective if visual modeling were an integrated part of the learning cycle that testing is?

In our daily work as testers we use all kinds of automation tools to support our testing efforts. Among them are many visualisation tools. But scientific studies show that sketching with a simple combination of pencil and paper might lead to better results.

The purpose of this presentation is to explain the power of drawing and how it benefits the iterative testing process. Examples from a real project will be used during this presentation to illustrate the points being made.

Key learning points:

* What is drawing and sketching?
* The power of visualization
* How drawings can make the learning cycle more effective.

Ruud Cox is a test consultant with extensive experience in the embedded systems industry. In the first part of his professional career he worked, with much enthusiasm, as a software developer. In 2001 he discovered his real passion: software testing. He considers himself a context-driven tester and sees the testing profession as a tour in which relevant subjects must be studied and practised in order to become a better tester. He has worked for companies in various domains such as consumer electronics, semiconductors, lighting and healthcare. He currently works for Improve Quality Services, a provider of consultancy and training in the field of testing. To share his ideas, knowledge and experiences, he publishes on his blog and tweets as @ruudcox.

Artful Testing – Zeger Van Hese

Art and testing may look like an odd couple. True, Glenford Myers combined both in his book “The Art of Software Testing”, but the art in there was strictly limited to the title page, since the term isn’t even mentioned once throughout the whole book. It referred to skill and mastery, of course, not to an aesthetic experience. More recently, Robert Austin and Lee Devin published “Artful making” which mainly addressed software development and its resemblance to art. This got me thinking: what about artful testing?

In this presentation I investigate what happens when we infuse testing with aesthetics. Can the fine arts in any way support or complement our testing efforts? With some surprising examples, I illustrate that I think they can.

The tools used by art critics, for instance, can be a valuable addition to our tester toolbox. They enable us to become software critics, engaging in demystification and deconstruction. Testers can also benefit from studying art and looking at it. After all, this largely resembles what we do when we are testing: thoughtfully looking at software. Art carries the risk of being mistaken for superficial “look and see”, as does testing: we look; we see what’s there – or we believe we do. But looking at something in ways that make sense of it calls for much more than that. It appeals to our experiential and reflective intelligence. Art feeds and stimulates the tester’s hungry eye.

As we are overloaded with greater amounts of information than ever before, our ability to find meaning in things surrounding us involves a complex set of thinking skills. Naming what we see is one of them. Analyzing context based on personal association and perspective, cultural knowledge, interpretation, evidence, imagination, exploration and risk is another. These questioning and reasoning strategies used in evaluating art can be applied in testing too. This is where testing and art can meet. Good testing should be artful, in so many ways.

Zeger Van Hese (born and raised in Belgium) has a background in Commercial Engineering and Cultural Science. He started his professional career in the movie distribution industry but switched to IT in 1999. A year later he got bitten by the software testing bug (pun intended) and has never been cured since. Over the years, he developed a passion for exploratory testing, testing in agile projects and, above all, continuous learning from different perspectives. He was the 2012 EuroSTAR program chair and recently founded his own company, Z-sharp, dedicated to helping clients on the path to smarter testing. He is co-founder of the Dutch Exploratory Workshop on Testing (DEWT), a founding member of the ISST, muses about testing on his TestSideStory blog and is a regular speaker at conferences worldwide. Contact Zeger at zeger@z-sharp.be.

Embracing Exploratory Testing – Carsten Feilberg

Although ET is not at all new, there are still many myths and misunderstandings about it, such as: it’s a black-box testing technique, it’s something magical done by testers after they have executed the real tests, it’s not structured, and a few more. We will not investigate why some people have gotten such a wrong idea of what Exploratory Testing is, but we will try to set the record straight about how Exploratory Testing is in fact very structured, disciplined, accountable and much more real testing than anything else. Furthermore we will take a look at tools and methods to help you guide it, manage it and report from it.
We will in particular take a look at some popular heuristics and techniques for note-taking, and get our heads around the low-tech dashboard and SBTM/MTBS.

After this “crash-course” you can give it all a try in the test lab exercise: Practice Exploratory Testing.

Carsten Feilberg has been testing and managing testing for more than a decade, working on various projects covering the fields of insurance, pensions, public administration, retail and other back-office systems, as well as a couple of websites. With more than 18 years as a consultant in IT, his experience ranges from one-person do-it-all projects to being delivery and test manager on a 70+ system migration project involving almost 100 people. He is also a well-known blogger and presenter at conferences and a strong advocate for context-driven testing. He lives and works in Denmark as a consultant at House of Test.

The Test Lab – Pascal Dufour & Ray Oei

Bring your laptop!

The Test Lab is the place to explore, test, play, learn, meet, confer, have fun and make friends… Many participants at conferences are interested to know what YOU can do as a tester. By sharing experiences, techniques and test ideas with those next to you in the Testlab, you can learn so much more. Speakers are urged to illustrate their talk, to show how it can be applied in practice. Everyone is welcome to join the Testlab. Continue to build your reputation as a tester and do that in the Testlab!

The Test Lab gives attendees the opportunity to test real systems with various kits and tools, in a live, practical environment. It’s an interactive place, providing direct hands-on experience for attendees to put their testing skills into practice.

The Lab sees live testing, games, competitions, expert sessions and is a really fun place to hang out, meet new friends and learn something new outside of the conventional classroom setting.

Pascal Dufour is a passionate tester and Scrum Master at codecentric. Pascal has a passion for agile projects where he tries to implement pragmatic test strategies combining agile and context-driven testing. With over ten years of experience at large international companies, Pascal has experience with different types of testing, from embedded software to system integration. Enthusiastic and creative, he tries to make testing more fun, making his work visual and as simple as possible. He helps team members improve their efforts in Scrum, emphasizing ethics, commitment, and transparency, and motivating people to use a dynamic approach to testing. He believes teams should contain all disciplines and work together to create solutions that solve problems. Pascal maintains a blog on pascaldufour.nl.

Born and bred in Amsterdam, The Netherlands, Ray Oei started out as a Physical Chemistry major in the mid-80s, where he got interested in the art of Molecular Dynamics. He did research on the behavior of fluids with simulations on supercomputers. Towards the end of his studies he started working at a little IT company, where he did the design, programming and project management of PBX software for companies like AT&T, Ericsson and Philips. Testing was still in its infancy back then… and he was also responsible for that. So he learned the hard way.

Nowadays he is called a “test consultant”; he is a coach and trainer. He is still a geek at heart and loves technical stuff and areas like performance testing. But the psychology of people, and the influence that has on the craft of testing, also has his interest.

Ray is a (lead) instructor with the Association for Software Testing for the different BBST courses. He has a blog (but is not a very active writer), tries to maintain the DEWT blog and can sometimes be found on twitter: @rayoei.

Practice Exploratory Testing – Carsten Feilberg

This session is a laboratory exercise where the participants will practice Exploratory Testing using heuristics, competing on note-taking and collecting information. Building on the presentation Embracing Exploratory Testing, the participants will do a 30-minute testing exercise followed by a debrief / open season for questions to wrap up our learnings.

Carsten Feilberg has been testing and managing testing for more than a decade, working on various projects covering the fields of insurance, pensions, public administration, retail and other back-office systems, as well as a couple of websites. With more than 18 years as a consultant in IT, his experience ranges from one-person do-it-all projects to being delivery and test manager on a 70+ system migration project involving almost 100 people. He is also a well-known blogger and presenter at conferences and a strong advocate for context-driven testing. He lives and works in Denmark as a consultant at House of Test.

Facilitation – Peter Schrijver

All Q&A (or Open Season) after the talks will be facilitated. Facilitation is key, as the Q&A is hopefully highly interactive. Discussions are often engaging and can be intense. Intense discussions are permitted as long as they remain professional and relate to the presented material, not the presenter. The facilitator manages the discussion and the presentation questions. The facilitation allows for diverse conversation on the topic and ensures everyone has an opportunity to speak.

Open Season Discussion
The facilitated discussion is also called “Open Season” and uses the coloured index cards provided to each attendee.

  • Green: The New Stack card signals the facilitator that you have a question or comment unrelated to the current discussion thread.
  • Yellow: The On Stack card signals the facilitator that you have a question or comment that relates to the current thread of discussion.
  • Red: The Burning Issue card is only to be used when you are urgently compelled to interrupt a speaker. It can be a point-of-order, an argument, a problem with facility acoustics, or something you need to say quickly because you have been provoked in a meaningful way. If you use your red card, the facilitator may confiscate it for the remainder of the conference – so use it wisely.
  • Blue: The Off Track card is only to be used when a participant feels the discussion has gone off track and needs to be brought back to the original discussion thread.

More information on facilitation and k-cards can be found here: http://testingthoughts.com/blog/26

Peter is a very experienced all-round tester who has worked since 1997 as a tester, test coordinator and test manager. He has several years of experience using SBTM as a test approach. Since 2005, Peter has worked as an independent consultant. He attends at least two conferences and two training sessions every year to keep his knowledge up to date and, where necessary, to broaden and deepen it. Peter is also an active member of TestNet and a co-founder of the Dutch Exploratory Workshop on Testing. In these communities of enthusiastic testers he discusses the testing profession with peers in order to keep up to date and improve. Peter is a very experienced facilitator.


TLT South Africa 2014

This event has already taken place. To view slide material and other things from the conference, please visit its archive page.

On November 14th 2014, we’re opening the doors to the first ever Tasting Let’s Test South Africa! Tasting Let’s Test events are one-day events, designed to give you a bite-sized taste of what the Let’s Test spirit is all about.

The event will take place at Sunnyside Park Hotel in Parktown, Johannesburg, South Africa.

We’re offering the full event at the low rate of R1300 (Rand) + VAT. You can register by following this link.

Program

Keynote

Rob Sabourin : Value Sync

Can quality products be delivered when teams, customers, users and stakeholders have conflicting values? Rob Sabourin suggests that the notions of “on time, on quality and on budget” are meaningless unless you are “on purpose”.
What do people value? Why do they value it? How does it matter?

Rob shares some rich and varied experiences leading successful development projects with synchronized core values between stakeholders, team members, customers and user communities throughout their relationship. Rob examines some projects which were dismal failures due to teams working at cross purposes with conflicting values.

Rob also looks at some of the most absurdly turbulent, chaotic, projects which were tamed and became glowing successes due to a deliberate focus on harmonizing a blend of business, technical, organization, team, individual and cultural values.

Explore how value sync can make a difference in your context.

Robert Sabourin has more than thirty-two years of management experience, leading teams of software development professionals. A well-respected member of the software engineering community, Robert has managed, trained, mentored and coached thousands of top professionals in the field. He frequently speaks at conferences and writes on software engineering, SQA, testing, management, and internationalization. The author of I am a Bug!, the popular software testing children’s book, Robert is an adjunct professor of Software Engineering at McGill University. Robert is the principal consultant (and president/janitor) of AmiBug.Com, Inc.

Session Talks

Carsten Feilberg : Practical Chartering

Exploratory testing is about optimizing the value of our work, but how do we guide ourselves to accomplish that? This talk is focusing on how to write charters – mission statements – that point out what is important to investigate, while they still trigger our natural curiosity and creative mode to search beyond the written words.

Carsten Feilberg has been in IT for more than 20 years, 14 of which have focused on test and test management. His experience ranges from small projects to large-scale programs. People and human interaction play a vital role in his work; he is a PSL graduate and studies behavioral economics in his spare time. He is a well-known blogger and presenter at conferences, strongly advocating common sense in programming and testing and applying the context-driven principles. He lives and works in Denmark as a consultant at House of Test.

Ben Kelly : What Do You Mean Agile Tester?

I don’t make a distinction between Agile Tester and Context Driven Tester. I’m a tester. I just happen to work with a team running a Scrum-ish flavour of Agile. We have our various rituals that are observed: stand-ups, demos, sprints, retrospectives. ‘Where does testing fit into Agile?’ seems to be an oft-heard query. A short and flippant answer might be: everywhere, and it happens naturally. If you’re being snarky, you might add ‘unless you’re a tester who is locked into an incompatible set of testing rituals’. The truth is that there are as many variables in team dynamics as there are in the projects they work on. Where you fit might differ based on things like the skills the group already has, how long the group has existed before you arrived, their openness to having a new team member and your skill set relative to theirs (to name a few). I’d like to spend some time talking about my experiences with the group I work with at eBay, the challenges we face, what works and what we’re working to improve. I like to think it will be illustrative for those who have never worked in an Agile context before and possibly enlightening for other testers who do work in an Agile context, but who differ in their approach.

Ben Kelly is a software tester at eBay in London. A tester for over a decade, he has done stints in various industries including Internet statistics, insurance and online language learning. When he’s not agitating lively discussion on other people’s blogs, he writes sporadically at testjutsu.com and is available on Twitter @benjaminkelly.

Johan Jonasson : Why Should You Care About Test Strategy?

Honestly, do you really care what your company’s test strategy says? Not even a little bit? Well, I don’t blame you. After all, a test strategy is just a useless boilerplate document detailing things like start and end criteria for different test phases, right? Wrong! A proper strategy shouldn’t have to be locked away in a useless old document that nobody reads. It should be visible, useful and known.

My idea of a good test strategy is one that will help me focus on the most important risks, one that guides my thinking as I test the product, and enables my results to withstand scrutiny. To achieve those things, my strategy needs to be practical, explainable and to the point. In this session, we will explore heuristics and tools that will let you create strategies for both yourself and your team. Strategies that can actually be used by real testers.

Johan Jonasson works as a test consultant, trainer and coach through House of Test. He specializes in helping organizations, teams and individuals to both broaden their approach to testing and to adopt a more agile mindset to traditional testing tasks. He regularly attends and speaks at national and international conferences.

Johan is a long-time active member of the context-driven testing community. He is one of the originators of the Let’s Test conference on context-driven testing and a long-time member of the Association for Software Testing, through which he sometimes co-teaches the well-known Black Box Software Testing course series. Johan also blogs regularly about testing at http://blog.johanjonasson.com.

Chris Blain : Intro to Big Data Testing

Map reduce, event streaming, interactive queries, machine learning, and cluster computing in general are more and more common parts of the technology stacks that testers must tackle as part of their test efforts. This session will provide an introduction to common technologies being deployed today, as well as lessons learned from my experience testing such systems.
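To give a feel for one of the terms above: below is a small, illustrative sketch (not material from the session) of the map/reduce idea in plain Python. The mini word-count pipeline and its sample input are invented for illustration; running the same logic over a tiny dataset with an obvious expected answer is one simple way a tester might sanity-check a larger cluster-computing pipeline.

```python
# Illustrative only: a toy map/reduce word count in plain Python.
# One way to test a cluster-computing pipeline is to run the same logic
# over a tiny dataset where the expected output can be checked by hand.
from collections import Counter

def map_phase(line):
    # Emit (word, 1) pairs for one input line.
    return [(word.lower(), 1) for word in line.split()]

def reduce_phase(pairs):
    # Sum the counts per word.
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

lines = ["to be or not to be", "to test is to question"]
mapped = [pair for line in lines for pair in map_phase(line)]
print(reduce_phase(mapped))  # 'to' should appear 4 times in this sample
```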

Chris Blain is a consultant who has more than fifteen years of experience working in software development on projects ranging from embedded systems to web applications. He is a former board member of the Pacific Northwest Software Quality Conference, and recently started speaking at conferences such as CAST. You can follow Chris on Twitter as @chris_blain.

Cindy Carless : Synchronicity, Serendipity and the Pursuit of Excellence in testing

Serendipity: the faculty or phenomenon of finding valuable or agreeable things not sought for.
Synchronicities are those moments of “meaningful coincidence” when the boundary dissolves between the inner and the outer. Touching the heart of our being. http://www.awakeninthedream.com/wordpress/
The perfect example occurred while preparing this… The article Cindy took this from is entitled “Catching the bug of synchronicity”.
This merging of Cindy’s inner and outer realities has been most evident in the past four years, during which she has focused on testing as not only a career, but a craft.
Cindy spent a year studying NLP with the intention of changing careers and becoming a life coach… What she discovered was that NLP was about the pursuit of excellence, and that her learnings and grappling with trying to apply NLP principles and practices have strong parallels to her learnings about testing and grappling with trying to apply those skills every day.

Cindy Carless has developed a diverse career over nearly 30 years. Cindy started in accounting and financial management, then moved into training and facilitation and, most recently, software testing. Cindy’s IT career started with a development house on a Y2K project for a fleet management system. Roles as an implementation consultant and project manager for an enterprise resource planning (ERP) system and as a business analyst on a banking platform project followed. Cindy has most recently been employed as a test analyst in the banking industry. Cindy subscribes to the context-driven school of testing and is continuously learning how to hone her skills in the testing craft.

Tim Coulter : Automation Is Like Beating Dumb Kids With A Stick

Automation is crappy because you probably won’t get it right the first time. But that’s okay, you’ll learn, and probably do it better the next time around. But that’s crap because you’ll probably never have time to refactor it. But that’s okay because if you’re diligent you can stay on top of test failures and fix them as soon as possible. Which, as you might expect, is crap because the test code keeps on growing. But that’s okay, you can always get the developers to help. Which is crap because they don’t have any time. But that’s okay, you can always get a day (only a day…) where the developers focus on the test code. Which is crap because they don’t know your design decisions and so cause a lot of coding problems. But that’s okay, if you’re lucky, you might just get some time to refactor…

Automation is hard. It’s even harder when the customers of the automation are the developers themselves. In my experience they’ve been hands off, always expected the automation to “just work”, and never understood the complexity or design decisions of what was being built. When they offered to “fix things” — perhaps while you were on a week’s vacation at a testing conference — they’d add new features to the framework in a way that was extremely frail, and only supported the short-sighted goal they set out to finish (say, a specific test passing). Non-obvious features like stability were not considered in their changes. In short, the value of automation is inherently tied to your team structure, and I’ll use this talk to discuss my experience with team structures that just didn’t work.

Tim Coulter is an American software engineer who has made a career of being the sole tester for multiple New York City startups. As a self-proclaimed “professional quitter” (don’t tell the startups that) Tim spent the last six years helping startups as they grew from fledgling code houses to 70-100 person companies. Recently, Tim struck a deal with his current employer to work part-time and 100% remotely so he can travel the world at the same time. Tim is also a software developer, successful entrepreneur and Bitcoin enthusiast in his spare time, and his most recent project is the price tracking site, https://bitcoinindex.es.

Tasting Let's Test Benelux

We’re proud to announce Tasting Let’s Test Benelux 2015, a one-day testing conference with an awesome lineup! TLT Benelux will take place on Tuesday March 10th 2015, at the Mezz in Breda, NL.

This is one event you don’t want to miss! Head on over to the registration page. If you register by the end of 2014, you’ll qualify for our Early Bird pricing at only €150 (regular price €199) for the full event, all inclusive with drinks, lunch and a simple rock and roll dinner afterwards!

Keynotes

Testing & Checking Explained – James Bach

In Rapid Software Testing methodology, we make a strong distinction between testing and checking. We do this because otherwise projects will have little respect for testing, and will pressure testers to behave robotically. Checking is short for “fact checking” or “output checking.” It is an evaluation activity that can be completely automated. Testing, on the other hand, can be supported by tools, but not automated. In other words, programming and testing are non-algorithmic, whereas compiling and checking are algorithmic. But just as compiling is irrelevant without programming, checking is irrelevant without testing. Checking is an important part of testing.

Since many people struggle with this distinction, in this talk I will show and discuss examples of testing that is not checking, testing that has a little checking, and testing that is mostly checking. I will show that checking and testing can live happily together.
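As a rough illustration of the distinction described above (this sketch is not taken from the talk): a check is a machine-decidable evaluation of a specific output, while the surrounding testing is the human work of deciding what is worth checking and making sense of the results. The add function and the test below are purely hypothetical.

```python
# A hypothetical example of an automated output check (pytest-style assertion).
# The check itself is algorithmic: a machine can evaluate it without judgment.
# Deciding which facts are worth checking, and investigating surprises,
# is the testing that surrounds it.

def add(a, b):  # stand-in for some function under test
    return a + b

def test_add_small_integers():
    assert add(2, 2) == 4  # a fact check about one specific output

if __name__ == "__main__":
    test_add_small_integers()
    print("check passed")
```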

James Bach is founder and principal consultant of Satisfice, Inc., a software testing and quality assurance company. In the eighties, James cut his teeth as a programmer, tester, and SQA manager in Silicon Valley in the world of market-driven software development. For nearly ten years, he has traveled the world teaching Rapid Software Testing skills and serving as an expert witness on court cases involving software testing. James is the author of Lessons Learned in Software Testing and Secrets of a Buccaneer-Scholar: How Self-Education and the Pursuit of Passion Can Lead to a Lifetime of Success.

Sessions

A nest of Tests – James Lyndsay

In the first half of this double-length, hands-on session, James Lyndsay will show you how to use simple datasets and tools to generate thousands of dull testing experiments and collect any number of individually uninteresting observations. He’ll propose and demonstrate ways that we can aggregate our measurements to show us something more interesting, and ways that we can adjust our design to investigate what we find.

As soon as we can, we’ll break out to play in the TestLab, and go to work on a specially set up piece of software; generating and executing bulk tests, visualising the output, and considering what to investigate next. At the end of the session, we’ll build a gallery of data, tricks and pictures to help us share our ideas and discoveries.
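For readers who want a concrete picture of the “many dull experiments, one interesting aggregate” idea, here is a minimal sketch in Python. It is not the tooling or dataset used in the session; the system_under_test function and its deliberately odd behaviour near zero are invented for illustration.

```python
# A rough sketch of generating many individually dull observations
# and aggregating them into something worth looking at.
import random
from collections import Counter

def system_under_test(x):
    # Invented stand-in for the real application: misbehaves very close to zero.
    return x * x if abs(x) > 0.001 else float("nan")

def classify(x):
    result = system_under_test(x)
    if result != result:          # NaN check
        return "not a number"
    return "negative" if result < 0 else "ok"

# Thousands of uninteresting individual experiments...
observations = [classify(random.uniform(-1, 1)) for _ in range(10_000)]

# ...aggregated into a summary that may reveal a pattern worth investigating.
print(Counter(observations))
```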

James has been testing since 1986, and has worked independently since setting up Workroom Productions in 1994. As a consultant, he’s worked in a variety of businesses and project styles; from retail to telecommunications, from rapidly-evolving internet start-ups to more traditional large-scale enterprise. He’s worked to technical requirements for companies that make and sell software, to commercial requirements for companies that buy and use software, and to unexpected requirements everywhere. He’s been in and out of agile (and Agile) teams since 2002. James was an internal irritant to the ISEB exam process for five years, is a regular speaker and occasional teacher, runs LEWT (the London Exploratory Workshop in Testing) and has won prizes for his papers.

Challenges of implementing CDT in a large organization – Jean-Paul Varwijk

Imagine you are living in a country that for the most part still heavily trusts and believes in standardized software testing and test certification, and that has only been exposed to Agile Testing and Context-Driven Testing in the last decade.

In 2011 and 2012 Rabobank International got its exposure to the Agile and Context-Driven Testing virus: externally through five days of Rapid Software Testing for all testers (June 2011) and master classes plus a train-the-trainer in Exploratory Testing (March 2012), both given by Michael Bolton. Internal exposure came mostly from two enthusiastic software testers, one of whom is Jean-Paul Varwijk.

In this presentation Jean-Paul will tell the story of how context-driven testing at Rabobank evolved while agile was being introduced at the same time. He will describe where it stands now and how he thinks it could develop in the future. The presentation has three angles. The first describes the changes at the organizational level: for instance, the creation of a new Test Policy, the organization of an annual Test Event, and the development of new Role Descriptions.

The second angle is on a more personal level: what did it mean to be one of the so-called ‘figureheads’ of Context-Driven Testing? The third angle is about the future of Context-Driven Testing in a larger organization. Jean-Paul wants to discuss his view on this and explore what could be done to help Context-Driven Testing become embedded and accepted in an organization’s (testing) culture.

Jean-Paul Varwijk is a senior test analyst at Rabobank International and owner of Arborosa Software Consultancy. Jean-Paul has broad experience in software testing in the financial sector. He considers himself a member of the Context-Driven School of testing and is a member of the Dutch Exploratory Workshop on Testing. He regularly participates in workgroups, has a blog (www.arborosa.org), and can be found on Twitter (@arborosa) and Skype (arborosa). He regularly speaks at international software testing conferences and welcomes people to confer with him and approach him with questions. Finally, Jean-Paul is one of the founding members of the International Society for Software Testing (www.commonsensetesting.org) and promotes an approach to software testing that emphasizes value and the role that skilled testers play in its delivery.

Helping the new tester to get a running start – Joep Schuurkes

When a new tester joins your team – or if you are the new tester joining – the question is how to get this new tester up to speed as effectively as possible. There are so many things to learn – about the application, the way of working within the team, the project, etc. In a way it’s quite similar to learning how to navigate a new city. In the beginning you’re overwhelmed by the complexity, by all the stimuli and impressions. Then you start to discover structures and patterns; things begin to fall into place.

However, if we explore this analogy further, we will see that the most common ways to get a new tester up to speed fall short. We don’t do justice to the complexity, we barely help with identifying patterns and structures and we shackle the new explorer instead of empowering him.

Luckily there are better ways to get someone up to speed and the analogy of navigating a new city proves helpful in identifying these. And as it turns out, those alternatives share quite some properties with good testing in general.

Armed with a degree in Philosophy, Joep entered the testing profession in 2006. About two years later he discovered context-driven testing and knew he was where he needed to be. The first six years of his career he spent at a contracting firm called Qquest, where he tested in telecom and insurance. He also taught general and chain testing courses – enjoying that a great deal. That experience also helped him when presenting at TestNet and TestBash. In 2013 Joep began working for Rabobank International, where he participates in lateral and interdisciplinary working groups. Later that same year he was invited to join the DEWTs, an invitation he was honoured to accept. In January 2015 Joep will begin a new and interesting challenge as a QA Engineer at Mendix. Joep has a blog (testingcurve.wordpress.com) and can be found on Twitter as @j19sch.

Testing’s next automation model – Andreas Faes

Everyone models. We all construct mental models of how the world works: it is the human mechanism of coping with the infinite complexity of… well, everything. Testers can use these models as drivers for their exploratory testing, learning and remodeling as they go along.

But what does this mean for test automation? Is it possible to use these implicit models and turn them into explicit models to create an automation framework that precisely fits your project’s needs? Which models of ‘what test automation is’ exist in the minds of others (and in the mind of the automator)? What are the constraints of using these implicit models? How do model shifts impact our automation?

This presentation reflects on these questions and explores, in a non-technical manner, what thinking in models can contribute to the life of the test automator.

Andreas Faes is a Test Tool Specialist at MIPS. He has been passionate about testing since 2008 and has specialized in test automation, with a special interest in open source technologies. He is a father, a technology enthusiast, and a regular speaker at conferences such as CTG eXperience Day, Belgium Testing Days and EuroSTAR.

Automation in DevOps and Continuous Delivery – Pascal Dufour

In this discussion track, after a short explanation and experience report, we will have a long facilitated discussion about testing in Continuous Delivery and DevOps. If you want to know what these concepts mean and how they will influence testers in their work, this is the track you don’t want to miss! If you already know about this or have experience with it, we invite you to share your experiences with the audience.

Pascal Dufour will explain how DevOps and Continuous Delivery could work in your organisation. He will shine a light on the history, the process and the tooling used to successfully test products. Pascal will explain how testing could work in a DevOps and/or Continuous Delivery environment, using pairing in testing and test automation.

After setting the stage by explaining these concepts, Pascal will zoom in on his recent project in which DevOps and Continuous Delivery were implemented. The product consists of mobile and web technology offering customer loyalty services. Automated checks played a big role in unit testing, BDD-style service layer testing and infrastructure testing.
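To give a flavour of the kind of automated check mentioned above, here is a minimal sketch of a BDD-style service-layer check written as plain Python, with no particular framework implied; the loyalty rule, function names and values are invented for illustration and are not taken from Pascal’s project.

```python
# A hypothetical given/when/then check for an invented loyalty-service rule:
# one point per whole euro spent. Written as a plain assertion so it could run
# in any test runner or pipeline stage.

def award_points(purchase_amount):
    # Hypothetical service-layer rule.
    return int(purchase_amount)

def test_customer_earns_points_for_a_purchase():
    # Given a customer purchase of 25.50 euro
    purchase_amount = 25.50
    # When the loyalty service awards points
    points = award_points(purchase_amount)
    # Then the customer receives 25 points
    assert points == 25

if __name__ == "__main__":
    test_customer_earns_points_for_a_purchase()
    print("service-layer check passed")
```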

Pascal Dufour is a passionate tester and Scrum coach at Validate-it (http://www.validate-it.nl). Pascal has a passion for Agile projects, where he tries to implement pragmatic test strategies that combine Agile and context-driven testing. With over ten years at large international companies, he has experience with different types of testing, from embedded software to system integration. Enthusiastic and creative, he tries to make testing more fun by making his work visual and as simple as possible. He helps team members improve their efforts in Scrum, emphasizing ethics, commitment and transparency, and motivates people to use a dynamic approach to testing. He believes teams should learn, try, experiment and work together to create solutions that solve problems. Pascal maintains a blog on pascalaufour.nl and tweets as @Pascal_dufour.

The Let's Testlab

Bring your laptop and some extra devices to test!

The Test Lab is the place to explore, test, play, learn, meet, confer, have fun and make friends… Many conference participants are interested in what YOU can do as a tester. By sharing experiences, techniques and test ideas with those next to you in the Test Lab, you can learn so much more. Speakers are urged to illustrate their talks in the lab and show how their ideas can be applied in practice. Everyone is welcome to join. Continue to build your reputation as a tester – and do it in the Test Lab!

The Test Lab gives attendees the opportunity to test real systems, robots, various kits and tools, in a live, practical environment. It’s an interactive place, providing direct hands-on experience for attendees to put their testing skills into practice.

The Lab hosts live testing, games, competitions and expert sessions, and is a really fun place to hang out, meet new friends and learn something new outside the conventional classroom setting.

Our Test Lab will be run by James Lyndsay & Bart Knaack, who have experience doing this at conferences such as EuroSTAR, STAREAST, STARWEST, Agile Testing Days, Let’s Test and many, many more…

Facilitation

All Q&A (or Open Season) after the talks will be facilitated. Facilitation is key, as the Q&A is hopefully highly interactive. Discussions are often engaging and can be intense. Intense discussions are permitted as long as they remain professional and relate to the presented material, not the presenter. The facilitator manages the discussion and the questions about the presentation. Facilitation allows for diverse conversation on the topic and ensures that everyone has an opportunity to speak.

Open Season Discussion
The facilitated discussion is also called “Open Season” and uses the coloured index cards provided to each attendee.

  • Green: The New Stack card signals the facilitator that you have a question or comment unrelated to the current discussion thread.
  • Yellow: The On Stack card signals the facilitator that you have a question or comment that relates to the current thread of discussion.
  • Red: The Burning Issue card is only to be used when you are urgently compelled to interrupt a speaker. It can be a point-of-order, an argument, a problem with facility acoustics, or something you need to say quickly because you have been provoked in a meaningful way. If you use your red card, the facilitator may confiscate it for the remainder of the conference – so use it wisely.

More information on facilitation and k-cards can be found here: http://testingthoughts.com/blog/26

Our facilitator for Tasting Let’s Test is Peter “Simon” Schrijver.

Peter is a very experienced all-round tester who has worked since 1997 as a tester, test coordinator and test manager. He has several years of experience using SBTM as a test approach. Since 2005, Peter has worked as an independent consultant. He attends at least two conferences and two training sessions each year to keep his knowledge up to date and, where necessary, to broaden and deepen it. Recently Peter has started speaking at conferences as well. Peter is also an active member of TestNet and a co-founder of the Dutch Exploratory Workshop on Testing. In these communities of enthusiastic testers he discusses the testing profession with peers, helping them all keep up to date and improve. Peter is a very experienced facilitator.

Let's Test Conferences on Context-Driven Testing - For Testers, By Testers

When we say “for testers, by testers” we mean that our main goal for these conferences is to make them a valuable experience for all participants, not to maximize profit. We are ourselves part of a team of serious, passionate and professional testers who back in 2011 decided that it was time to set up a context-driven testing conference in Europe. Since the inaugural Let’s Test conference in 2012, some team members have left and others have joined, and the conference is now organized in Australia as well as in Europe. We’re happy to see the Let’s Test family grow, but regardless of which Let’s Test conference you visit, you can be sure that we’ll never compromise on the “for testers, by testers” principle.