It wouldn't be that hard to make a basic framework that does what you want, and I'd even be interested in hearing more and helping out.  That said, I think the reason, say, Simpletest reports success with only a single line is that it's not really important what passes as long as it all passes.  The goal of the tests is really to reassure the developer that what he expects to happen is happening, and I can't think of a reason off the top of my head why he would want more specific details.

 

But let's talk more about the benefits of seeing a per-pass results page, and how to go about implementing that.

It wouldn't be that hard to make a basic framework that does what you want,

 

I realized that, and am already well underway. Basic functionality is as good as finished.

 

That said, I think the reason, say, Simpletest reports success with only a single line is that it's not really important what passes as long as it all passes.

 

You can spare yourself the cautiousness of that statement; it says so quite bluntly on their (his) website:

You may not like the rather minimal style of the display. Passes are not shown by default because generally you do not need more information when you actually understand what is going on. If you do not know what is going on then you should write another test.

 

I wholeheartedly disagree with that statement.

 

But let's talk more about the benefits of seeing a per-pass results page, and how to go about implementing that.

 

If a test passes, it passes. The information provided by simpletest (the name of the test and the fact that it passed) is sufficient in that case. But when a test fails, I want to know immediately which assertions failed and which passed. I don't want to have to go back to the test code first and only then mend the tested code; I should be able to dive right into the code to be mended.

 

From another perspective, one might say I've not written my test very well if the message accompanying a failed assertion doesn't tell me what I need to know. Even then, I'd say a test framework should assist me in improving the test code, and it can do that by giving me a clear overview of which assertions did pass in a failing test.

 

I've made a very simple implementation to accomplish this. Basically, in the concrete test cases, when you add an assertion, you get the opportunity to add a pass-message as well as a fail message... I might build on it a little to print out the code of the failed assertion.

<?php
abstract class Backbone_UnitTest_TestCase {

private $tests = array();
private $runningTestName;

public static function run(){
	// Instantiate the concrete subclass; the abstract base itself
	// cannot be constructed (requires PHP 5.3+ late static binding).
	$testcase = new static();
	$testcase->setUp();
	$testMethods = $testcase->getTestMethods();
	foreach($testMethods as $methodName){
		// Track the running test so assertions can register against it.
		$testcase->runningTestName = $methodName;
		$test = new Backbone_UnitTest_Test($methodName, $testcase);
		$testcase->tests[$methodName] = $test;
		$test->execute();
	}
	echo Backbone_UnitTest_Renderer::render($testcase->tests);
	$testcase->tearDown();
}
private function getTestMethods(){
	// Collect every method whose name starts with 'test'.
	$refl = new ReflectionObject($this);

	$methods = array();
	foreach($refl->getMethods() as $reflMethod){
		if(substr($reflMethod->getName(), 0, 4) == 'test'){
			$methods[] = $reflMethod->getName();
		}
	}
	return $methods;
}
public function assertTrue($bool, $failMsg = '', $passMsg = ''){
	if($bool !== true){
		throw new Backbone_UnitTest_FailedAssertion($failMsg);
	}
	// Record the pass-message on the currently running test.
	$this->tests[$this->runningTestName]->addAssertion($passMsg);
}
...
}?>
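
For illustration, a concrete test case against the sketch above might look like the following; Calculator, its double() method and the setUp()/tearDown() bodies are hypothetical, and the block only demonstrates how a fail message and a pass-message would be supplied per assertion.

<?php
// Hypothetical concrete test case built on Backbone_UnitTest_TestCase above;
// Calculator and its double() method are invented for the example.
class CalculatorTest extends Backbone_UnitTest_TestCase {

private $calculator;

public function setUp(){
	$this->calculator = new Calculator();
}
public function tearDown(){
	$this->calculator = null;
}
public function testDoubleDoublesItsArgument(){
	$result = $this->calculator->double(2);
	$this->assertTrue(
		$result === 4,
		'double(2) returned ' . var_export($result, true) . ' instead of 4', // fail message
		'double(2) correctly returned 4'                                     // pass message
	);
}
}

CalculatorTest::run();
?>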

Are you certain simpletest doesn't allow this?  It has a method pass($message); but I'm not sure what it does, as I've never had any luck getting it to do anything.
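
For what it's worth, if I recall SimpleTest's display layer correctly, passes get swallowed because the stock HtmlReporter never paints them. A sketch of surfacing them, assuming the paintPass() hook and the run(reporter) call work the way I remember:

<?php
// Sketch only: assumes SimpleTest's HtmlReporter exposes a paintPass($message)
// hook that the default HTML output does not print.
require_once 'simpletest/unit_tester.php';
require_once 'simpletest/reporter.php';

class ShowPassesReporter extends HtmlReporter {
	function paintPass($message) {
		parent::paintPass($message);
		// Print each pass-message instead of only counting it.
		print "<span class=\"pass\">Pass</span>: $message<br />\n";
	}
}

// Usage: run any test case with the custom reporter.
// TestOfCalculator is a hypothetical UnitTestCase subclass.
$test = new TestOfCalculator();
$test->run(new ShowPassesReporter());
?>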

 

That said, I'm still not 100 percent sure what you're looking for.  It sounded like you wanted to know which test/line failed, but that's already provided, so you must have meant something different.

 

I think writing messages for failed cases and having them print on failure might be useful, but messages for passed cases may be too much.  If you're running hundreds of tests at once and one fails, you don't want to see hundreds of messages -- you'll lose the important one in all the noise.

 

The only thing I see really being a bother with adding a "failure message" to tests is that it'd be a lot of extra stuff to write, and very easy to just skip it.  Is there any way to automate the messages?  Surely most of the messages will just say something that is already contained in the test name and arguments?
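
If the base class from earlier in the thread were extended, one way to make the messages optional would be to derive defaults from the assertion arguments. A rough sketch, with a hypothetical assertEquals() built on top of the assertTrue() shown above:

<?php
// Hypothetical addition to the earlier Backbone_UnitTest_TestCase sketch:
// when no messages are supplied, build them from the assertion arguments,
// so skipping the messages still leaves a readable report.
abstract class Backbone_UnitTest_TestCaseWithDefaults extends Backbone_UnitTest_TestCase {

public function assertEquals($expected, $actual, $failMsg = '', $passMsg = ''){
	if($failMsg === ''){
		$failMsg = sprintf('expected %s but got %s',
			var_export($expected, true), var_export($actual, true));
	}
	if($passMsg === ''){
		$passMsg = sprintf('value equals expected %s', var_export($expected, true));
	}
	$this->assertTrue($expected === $actual, $failMsg, $passMsg);
}
}?>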

I think writing messages for failed cases and having them print on failure might be useful, but messages for passed cases may be too much.  If you're running hundreds of tests at once and one fails, you don't want to see hundreds of messages -- you'll lose the important one in all the noise.

 

Only pass-messages of passed assertions within failed tests would print.
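
A minimal sketch of how the renderer could apply that rule, assuming each test object knows whether it failed and which pass-messages it collected (hasFailed(), getFailMessage() and getPassMessages() are hypothetical accessors):

<?php
// Hypothetical renderer: pass-messages only print for tests that failed,
// so a fully green run stays terse while a red run shows its context.
class Backbone_UnitTest_Renderer {

public static function render(array $tests){
	$output = '';
	foreach($tests as $name => $test){
		if(!$test->hasFailed()){
			$output .= "PASS: $name\n";
			continue;
		}
		$output .= "FAIL: $name - " . $test->getFailMessage() . "\n";
		// Show which assertions did pass before the failure,
		// so the broken spot is narrowed down immediately.
		foreach($test->getPassMessages() as $passMsg){
			$output .= "    passed: $passMsg\n";
		}
	}
	return $output;
}
}?>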
